99: Multithreading. Cache Lines.

Take Up Code - A podcast by Take Up Code: build your own computer games, apps, and robotics with podcasts and live classes


If you’re not careful, you can cause a processor to come to an immediate and full stop while it waits for data to move around in memory. That’s probably not the performance boost you were looking for. In order to avoid these delays, you need to be aware of two things and make sure to test your code so you fully understand what it’s doing.

Cache invalidation can occur when a thread running on one processor modifies a variable that another thread, running on another processor, is using from its own cache. This causes cache data to be moved around as the processors make sure that data stays consistent, and it causes a processor to halt while it waits for the data to be updated.

You can also get cache invalidation even when two threads are each working with their own variables, if those variables happen to reside in the same cache line. Packing your data together so it observes locality of reference helps the cache work for you, but to avoid this false sharing you also need to make sure that you don’t divide data that’s been packed like this between multiple threads.

Listen to the full episode or you can also read the full transcript below.

Transcript

I’ve found few books that explain this, and I actually skipped one of the patterns in the Game Programming Patterns book that explained only half the story.

Let me first explain what a cache is, because if you don’t understand this, then you’ll not get much benefit from this episode. A cache is a simple concept that we use all the time to speed things up. When you buy groceries, do you buy one item at a time? Do you walk in the store, pick up a loaf of bread, wait in line, pay for it, take it all the way home, and then repeat the whole process for the sugar, juice, and apples? Of course not. You put everything in a cart first. When you need to fold clothes, do you go to the dryer, take out a pair of socks, fold them, and put them in a drawer before going back to the laundry room for another pair of socks? Of course not. You grab everything out of the dryer at once, put it all in a basket, and take it somewhere to fold. These are caches, and computers use the same concept.

When a processor wants to get some values from memory, it gets more than it needs, because chances are good that the next value it needs will be nearby and it might get lucky and pick up the next values along with the first. When a processor does find the value it needs in the cache, this is called a cache hit. And when it can’t, this is called a cache miss. And just like when you accidentally leave some clothes in the dryer and have to make a special trip to get them, a cache miss causes a processor to have to make another trip back to memory too.

The book Game Programming Patterns describes a design pattern called Data Locality which teaches you to keep your data nearby in memory. We sometimes use linked lists, which allow us to easily insert or remove items from a collection without needing to move the items in memory. Each item can remain wherever it is in memory, and the collection just stores a pointer to the item. While this can definitely be a good thing, it also causes the processor to jump around in memory, and that means the processor has to reload its cache much more often. You see, the cache is much faster than main memory, but it’s also much smaller. Being aware of this and testing your code to find the slow spots will help you to write better code. But this isn’t the whole story.
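The episode doesn’t show code, but here’s a minimal C++ sketch of how false sharing can show up and one common way to avoid it. The example assumes a 64-byte cache line, which is typical but not guaranteed on every processor; the struct, variable names, and loop counts are made up for illustration.

```cpp
#include <atomic>
#include <thread>

// Two counters that two threads update independently. Without the alignas
// padding, both counters would likely land in the same cache line, so each
// thread's writes would keep invalidating the other processor's cached copy
// (false sharing) even though they never touch the same variable.
struct Counters
{
    alignas(64) std::atomic<long> first{0};   // forced onto its own cache line
    alignas(64) std::atomic<long> second{0};  // forced onto a different cache line
};

int main()
{
    Counters counters;

    // Each thread works only with its own counter.
    std::thread t1([&counters] {
        for (int i = 0; i < 1000000; ++i)
            counters.first.fetch_add(1, std::memory_order_relaxed);
    });
    std::thread t2([&counters] {
        for (int i = 0; i < 1000000; ++i)
            counters.second.fetch_add(1, std::memory_order_relaxed);
    });

    t1.join();
    t2.join();
    return 0;
}
```

Removing the alignas(64) specifiers keeps the program correct but lets the two counters share a cache line, which is exactly the situation described above: each thread has its own variable, yet the threads still slow each other down.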
When you add multithreading to your application, you need to once again reconsider your approach to organizing data in memory. I’ll explain how multithreading changes things right after this message from our sponsor.

(Message from Sponsor)

Alright, you know how to protect your data from race conditions by first obtaining a lock. And you also know that it’s a good idea to get the lock, do what you need as quickly as possible, and then release the lock. You want to be quick be…
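As a rough sketch of that lock advice, assuming C++ and std::mutex (the episode itself doesn’t show code, and the function and variable names here are made up): hold the lock only long enough to touch the shared data, then release it before doing anything slow.

```cpp
#include <mutex>
#include <vector>

std::mutex scoresMutex;
std::vector<int> scores;  // shared data protected by scoresMutex

void addScore(int score)
{
    {
        std::lock_guard<std::mutex> lock(scoresMutex);  // obtain the lock
        scores.push_back(score);                        // quick work while locked
    }   // lock released here when lock_guard goes out of scope

    // Anything slow (logging, file I/O, etc.) happens outside the lock
    // so other threads aren't kept waiting.
}
```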
