Designing Windows Store Apps for Performance
Handling a lot of content
Some apps handle large datasets. Many shopping apps make thousands of items available to the user, photo browsers let the users browse large image libraries, and social apps handle countless messages, replies, comments, and so forth. If you’re building an app that needs to handle a lot of content, there are several things to keep in mind. I go through the following topics:
- Prioritize your content.
- Partition content to reduce workload.
- Size or decode images.
- Cache information to improve transitions.
Prioritize your content
I know this repeats one of the points of the previous section, but prioritizing is a very important part of performance work in general, so it bears repeating. Every time you make some scenario run faster, you do so at the expense of something else. There’s no way every part of your app can be as fast as possible, so you need to prioritize.
If your app needs to show a lot of content, some content will be more important than other content. If you do not figure out what the important parts are and focus on those, you force the user to make this prioritization and that is not a great user experience. This is the reason news apps have top stories—they help the readers focus their attention on the important parts. Furthermore, it is virtually impossible to avoid suggesting some kind of prioritization even when you try not to. A news app that simply lists all the available stories without any embellishment still suggests priority by the order in which the stories are listed.
The key here is that priority is a useful tool that shapes the user experience. You can use this tool to your advantage. Users are looking for order, so don’t give them chaos. If your app has a lot of content, the best way to improve the user experience and the performance is to acknowledge that some of the content is going to be more important to the user and make sure this is readily available.
In practical terms, this means everything visible on the first page of the app is more important than all the off-screen items. Furthermore, you probably even want to prioritize the content of the first page as well. The new Hub control introduced in Windows 8.1 is an excellent way to do that, but there are other useful ways to achieve the same effect.
Many things affect our perception of what is important and what isn’t. Order, size, and color of elements all affect how we interpret importance. The Hub control uses all these elements to create a clear ordering of things, as you can see in Figure 3-2. On the left, a large picture attracts the user’s attention. Because many users read from left to right, both the size and the placement of this item underlines its importance. The next item in the second column is smaller, suggesting it is less important. However, it has a picture to draw attention to it, which makes it more important than the item in the third column, and so forth.
FIGURE 3-2 Mockup of a Hawaii app based on the Hub app template.
The Hub control establishes an obvious order of the elements on the page. To support this, your app must make sure the necessary resources are retrieved in the same order.
Ideally, all the important resources should be loaded locally. The app can attempt to retrieve the resources online, but the only way to guarantee good performance is to have a mechanism to fall back to local content in case network latency is too high. Your app can launch asynchronous tasks to retrieve the elements, but if these run for more than a few hundred milliseconds, the app should use cached resources instead, even if these are stale. The content can be refreshed once the resources are available. If you optimize for responsiveness, your users will be able to use the app but might not have the latest data. On the other hand, if you optimize for absolutely up-to-date content, users will get neither responsiveness nor fresh content when the network isn’t cooperating.
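To make this concrete, the following is a minimal sketch of one way to race the network request against a short timeout and fall back to cached content. The FetchLatestAsync, LoadCachedAsync, and ScheduleRefreshWhenDone methods and the Story type are hypothetical placeholders for your own data access code; the part to take away is the Task.WhenAny pattern combined with Task.Delay.

// A minimal sketch, assuming hypothetical FetchLatestAsync, LoadCachedAsync,
// and ScheduleRefreshWhenDone methods and a Story data type.
private async Task<IEnumerable<Story>> GetTopStoriesAsync()
{
    Task<IEnumerable<Story>> fetchTask = FetchLatestAsync();
    Task timeoutTask = Task.Delay(TimeSpan.FromMilliseconds(300));

    Task completed = await Task.WhenAny(fetchTask, timeoutTask);
    if (completed == fetchTask)
    {
        // The network answered in time; use the fresh data.
        return await fetchTask;
    }

    // The network is too slow right now; show cached (possibly stale) content
    // and refresh the view when the fetch eventually completes.
    ScheduleRefreshWhenDone(fetchTask);
    return await LoadCachedAsync();
}

With this approach, the user always gets content within a few hundred milliseconds, and the freshest data replaces the cached data as soon as it arrives.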
A common problem in this area is that the server-side APIs might not be designed to support the prioritization of the content your app needs. I have seen several apps that get all their resources in bulk. This means the important resources are bundled with the less important resources, and the result is that the app cannot do anything before everything is ready. In other words, content cannot be prioritized and the overall workload increases. That hurts performance. You need to make sure your back end supports specific queries so that your app can retrieve just the resources it needs at any given time. When your app uses a back end, performance is affected by both the app itself, the back end, and the protocol used to communicate between the two. All three must be designed with performance for the specific scenarios in mind and the maxim “less is more” applies to all of them. Make sure your app is simple; make sure you don’t send, receive, and process more data than necessary; and design your back end to support the queries your app needs to make.
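To illustrate what a specific query can look like from the app's point of view, here is a small sketch that requests only the top items needed for the first page instead of the full catalog. The endpoint, query parameters, and payload are assumptions, not part of any real back end; the point is simply that the request is scoped to what the page actually shows.

// Sketch only: ask the back end for just the items the first page needs.
// The endpoint and query parameters are hypothetical.
private async Task<string> GetTopStoriesJsonAsync()
{
    using (var client = new HttpClient())
    {
        var uri = new Uri("https://example.com/api/stories?section=top&count=10");
        return await client.GetStringAsync(uri);
    }
}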
Partition content to reduce workload
Very large datasets present a number of challenges. If your app needs to handle large datasets, you should spend some time identifying the maximum reasonable size. Although XAML offers several ways to help you partition the content on the screen, you still need to figure out what a manageable amount of content looks like.
Even though random access virtualization allows the user to page through huge collections of data, you need to ask yourself if that’s the best way to present the content. Nobody wants to flip through dozens of pages to find an item. After all, there’s a reason department stores have different departments and your newspaper has different sections. Your app should make it easy for the user to zoom in on the content she desires.
Virtualization
Once you identify the right partitioning of your app’s data, you might still need to present a lot of items on the screen. As discussed in Chapter 2, XAML offers two ways to handle lists or grids with a lot of items. To avoid repeating myself here, I will just talk about grids, but keep in mind that these ideas apply to lists as well.
As you know, grids can be either virtualized or not virtualized. When a grid is virtualized, the XAML engine renders content on demand. When a grid is not virtualized, everything is rendered up front. Grids that are not virtualized can have great performance once everything is set up, but the time needed to set everything up can easily hurt performance significantly. Nonvirtualized grids don't scale well because of this. If your app has to build a large nonvirtualized grid as part of startup or page navigation, performance suffers. Consequently, the recommendation is to use nonvirtualized grids only for small and simple grids.
Virtualized grids are much better at handling large collections because most of the work can be deferred until needed and the underlying data structures can be recycled. Virtualization reduces the work and the storage needed to handle and display the collection, which improves the performance. Virtualized grids scale much better with the number of elements, and they should be your preferred choice for anything but the simplest grids.
However, virtualized grids pay for great startup performance by doing more work during the navigation of the data, so you need to pay attention to how much work is required to handle each element in the grid. XAML offers a couple of ways to customize how this work is handled.
As you saw in Chapter 2, virtualization requires that the ItemsPanel of the GridView support virtualization. In Windows 8, the default ItemsPanel was WrapGrid, which supports virtualization. If you change the ItemsPanel to VariableSizedWrapGrid, you disable virtualization for the GridView because this panel doesn’t support virtualization.
If your data source was grouped, you would use a VirtualizingStackPanel as your ItemsPanel for the GridView (or nothing at all because VirtualizingStackPanel was the default). The VirtualizingStackPanel virtualizes the content based on entire groups, which means that all groups that are completely or partially on the screen are fully rendered. Groups that are completely off the screen are not rendered. If panning brings these groups on the screen, they get rendered as expected.
Group-based virtualization works well if you have many smaller groups. Most of the groups will be off the screen at any given time, so the XAML engine doesn’t spend time rendering those. However, if you have few, large groups, the majority of any given group will be off the screen most of the time. In that case, XAML still spends a lot of time rendering items in the group that are not visible.
To work around this, Windows 8.1 introduced a new ItemsPanel called ItemsWrapGrid, which supports virtualization at the item level. This support allows XAML to virtualize within groups, allowing it to render only a part of any group as necessary. Furthermore, in Windows 8.1, ItemsWrapGrid replaces VirtualizingStackPanel as the default ItemsPanel for grouped grid views. This means that if you have a Windows 8 app that uses the default ItemsPanel, all you have to do is retarget your app for Windows 8.1 and your app will automatically use the improved ItemsWrapGrid. However, if you customized the ItemsPanel to explicitly use another panel (or explicitly specified the use of VirtualizingStackPanel), you need to update this to use ItemsWrapGrid if you want your app to use the improved version.
To illustrate the performance differences between the different kinds of virtualizations, let me walk you through an example. Figure 3-3 shows a simple grid app with a grouped data source. The app displays 900 abstract pictures along with a title and a short description for each picture. The pictures are all local assets, so network response times don’t affect the performance in this case. The pictures are grouped into five groups of 180 pictures each.
FIGURE 3-3 Grid app using a grouped data source.
The XAML for this app is shown in Listing 3-8.
LISTING 3-8 XAML for grid page.
<Page x:Class="GridVirtualization.MainPage" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:local="using:GridVirtualization" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d"> <Page.Resources> <CollectionViewSource x:Key="PictureSource" IsSourceGrouped="True" Source="{Binding}"/> </Page.Resources> <Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}" Margin="40"> <GridView ItemsSource="{Binding Source={Binding PictureSource}}" SelectionMode="None" > <GridView.ItemTemplate> <DataTemplate> <Grid Width="188" Height="125" Margin="10"> <Grid.RowDefinitions> <RowDefinition Height="auto" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Image Grid.Column="0" Source="{Binding Location}" /> <StackPanel Grid.Column="1" Margin="10,0,10,0"> <TextBlock Text="{Binding Title}" FontSize="20" /> <TextBlock Text="{Binding Tagline}" /> </StackPanel> </Grid> </DataTemplate> </GridView.ItemTemplate> <GridView.ItemsPanel> <ItemsPanelTemplate> <VirtualizingStackPanel Orientation="Horizontal" /> </ItemsPanelTemplate> </GridView.ItemsPanel> <GridView.GroupStyle> <GroupStyle> <GroupStyle.HeaderTemplate> <DataTemplate> <TextBlock Style="{StaticResource SubheaderTextBlockStyle}" Text="{Binding Key}" Margin="10"/> </DataTemplate> </GroupStyle.HeaderTemplate> <GroupStyle.Panel> <ItemsPanelTemplate> <VariableSizedWrapGrid/> </ItemsPanelTemplate> </GroupStyle.Panel> </GroupStyle> </GridView.GroupStyle> </GridView> </Grid> </Page>
The XAML shouldn’t contain any big surprises. This is similar to how the Grid App template would arrange the markup for a Windows 8 app. The XAML sets up a grouped data source and binds that to a GridView. The interesting part is the GridView.ItemsPanel section. This section defines the ItemsPanelTemplate for the grid elements. In this case, I explicitly specify a horizontal VirtualizingStackPanel, as shown in the listing. Because this panel doesn’t support virtualization on the item level, performance suffers because of the large groups in the grid. On my Surface 2, displaying the grid takes around 2.5 seconds, with another 2.5 seconds before the entire set of visible images is rendered. That’s 5 seconds in total, which is far beyond acceptable.
These numbers obviously depend on the size of the total grid as well as the size of each group. In this case, each group contains 180 elements. With 24 elements shown on the screen on a Surface 2, the bulk of any group is off the screen at any given point. Because VirtualizingStackPanel causes virtualization to happen per group, XAML has to do a lot of work to render the off-screen elements for the group or groups being displayed.
If I change the markup to use the new ItemsWrapGrid instead, the grid virtualizes on a per-item basis, which improves the performance drastically in this case. I measured less than 1.5 seconds for the grid to appear with all visible items rendered on my Surface 2. At this point, the app is responsive to the user. That’s certainly within the desired target values, and it shows how big a difference simply picking the proper ItemsPanel can make. Further optimizations might be possible, but picking the proper panel was clearly the important first step. Of course, you don’t actually have to specify the ItemsPanel explicitly, because ItemsWrapGrid is the default if you target Windows 8.1. However, if you do specify an ItemsPanel, make sure you pick the right one for the task.
Of course, if you use a non-virtualizing ItemsPanel such as VariableSizedWrapGrid instead, performance suffers as XAML has to render the entire grid. In this case, I measured around 8 seconds to render the grid on a Surface 2 when using a non-virtualizing ItemsPanel.
Placeholders
In Windows 8, a common problem with large, virtualized grids was what we call panning to black. Recall that when a grid or a list is virtualized, XAML has to create and render elements as they are panned into view. Creating and rendering each of these items take time. If the items are too complex, too numerous, or both, XAML might not be able to keep up and render these as quickly as needed, which means that nothing is displayed for these elements until XAML is able to catch up. Because many apps use a black background, the result of this problem is usually a large black area, as illustrated in Figure 3-4, and hence the term panning to black.
FIGURE 3-4 Panning to black—a common problem with large, virtualized grids in Windows 8.
The app in Figure 3-4 is based on the abstract pictures example mentioned earlier, but as you can see I made the formatting of each item a little more complex. In the new layout, there’s a border around each picture and a drop shadow made from placing a rectangle underneath each picture, and the text is now placed on a rectangle on top of the picture itself. Furthermore, I reduced the size of each picture element to fit more items on the screen. The more items and the more complex they are, the more likely it is that XAML might not be able to keep up during fast panning on slower devices. In this case, I have enough items on the screen to cause rendering problems on low-end devices.
Fortunately, this issue has been addressed in Windows 8.1 as well with the ShowsScrollingPlaceholders property on ListViewBase. Because both ListView and GridView specialize ListViewBase, this addition applies to both lists and grids.
When ShowsScrollingPlaceholders is enabled, XAML shows gray placeholder items in place of any items that it cannot render during panning as illustrated in Figure 3-5. This behavior immediately lets the user know that more items are being rendered. This makes for a much better user experience than simply panning through a sea of black. With the placeholders, the user is no longer in doubt about whether more content is coming.
FIGURE 3-5 When you enable ShowsScrollingPlaceholders, XAML renders gray boxes in place of content that has not been rendered.
ShowsScrollingPlaceholders is enabled by default, so you don’t need to do anything to take advantage of this feature if you’re building for Windows 8.1, but you might actually want to disable it in some cases. Why, you ask? Well, let me tell you about another feature in Windows 8.1.
Customized placeholders
Placeholders let the user know that additional items are on their way, but they are generic and don’t provide any specific information on what’s coming—after all, they are just gray boxes. If the XAML engine cannot keep up during panning, it is typically because each element being rendered is complex. By default, XAML attempts to render each visible item in full, and failing to do so, it renders the placeholder instead until it has the bandwidth to render the missing items.
But what if XAML could render the simple parts of each element before moving on to the more complex parts? The overall time to render the element wouldn’t improve, but instead of waiting for the entire element to render, the user would see some useful information for each element during panning and eventually all of each element on the screen.
The ContainerContentChanging event defined on ListViewBase allows just that. If you attach an event handler to this event, XAML calls your handler as it renders each element. This behavior allows you to render a simple version of the element initially. This could be something like a title or a brief description. From the handler, you can set up another handler to be called during the next phase of rendering the item. You can chain these handlers as you like and thus partition the work needed to render each element as you like. This is essentially an improved version of the placeholders feature, so if you implement this, you want to disable placeholders to avoid rendering both the generic placeholders and your own improved placeholders.
Let me walk you through an example. The app in Figure 3-6 is the same as you saw earlier for the example on placeholders. In the following, I walk you through how you can change this app to use the ContainerContentChanging event to provide custom placeholders instead of generic placeholders. The XAML for the app is listed in Listing 3-9.
FIGURE 3-6 This grid app shows a large number of somewhat complex elements and thus might not be able to provide a smooth panning experience on low-end devices.
LISTING 3-9 XAML for the abstract pictures viewer app.
<Page x:Class="GridVirtualization.MainPage" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:local="using:GridVirtualization" xmlns:d=http://schemas.microsoft.com/expression/blend/2008 xmlns:mc=http://schemas.openxmlformats.org/markup-compatibility/2006 mc:Ignorable="d"> <Page.Resources> <CollectionViewSource x:Key="PictureSource" IsSourceGrouped="True" Source="{Binding}"/> </Page.Resources> <Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}" Margin="40"> <GridView ItemsSource="{Binding Source={Binding PictureSource}}" SelectionMode="None" ShowsScrollingPlaceholders="True" > <GridView.ItemTemplate> <DataTemplate> <Border BorderBrush="Black" BorderThickness="1" Background="DarkGray"> <Grid Width="97" Height="65" Margin="5"> <Grid.RowDefinitions> <RowDefinition Height="auto" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Rectangle x:Name="dropShadow" Fill="Black" Opacity="0.8" Width="94" Height="62" Margin="5,5,0,0"/> <Image Grid.Column="0" Source="{Binding Location}" VerticalAlignment="Top" HorizontalAlignment="Left" Width="94" Height="62"/> <Rectangle x:Name="textBackground" Fill="Black" Opacity="0.75" Height="35" VerticalAlignment="Bottom" Margin="0,0,3,3"/> <StackPanel Grid.Column="1" Margin="3,0,0,6" VerticalAlignment="Bottom" > <TextBlock Text="{Binding Title}" FontSize="12" /> <TextBlock Text="{Binding Tagline}" FontSize="10" /> </StackPanel> </Grid> </Border> </DataTemplate> </GridView.ItemTemplate> <GridView.ItemsPanel> <ItemsPanelTemplate> <ItemsWrapGrid /> </ItemsPanelTemplate> </GridView.ItemsPanel> <GridView.GroupStyle> <GroupStyle> <GroupStyle.HeaderTemplate> <DataTemplate> <TextBlock Style="{StaticResource SubheaderTextBlockStyle}" Text="{Binding Key}" Margin="10"/> </DataTemplate> </GroupStyle.HeaderTemplate> <GroupStyle.Panel> <ItemsPanelTemplate> <VariableSizedWrapGrid/> </ItemsPanelTemplate> </GroupStyle.Panel> </GroupStyle> </GridView.GroupStyle> </GridView> </Grid> </Page>
The markup is similar to what you saw in Listing 3-8, but I added some additional presentation elements and changed the size of each picture item. Notice the use of the ShowsScrollingPlaceholders attribute in the GridView tag. This is explicitly set to true, which isn’t strictly necessary because that’s the default. In other words, you could delete this attribute and achieve the same effect. However, because part of this exercise is to disable the generic placeholders, I included the attribute so that you can see where it goes.
To change this from using generic placeholders to using ContainerContentChanging, I need to do a couple of things. First and foremost, I need to hook up the event handler in the GridView tag. Second, I need to name the element for each abstract picture so that I can reference it in the event handler. The last thing I need to do in the markup is turn the generic placeholders off so that XAML doesn’t spend time rendering those as well as my custom placeholders. The updated XAML is shown in Listing 3-10.
LISTING 3-10 XAML updated to use ContainerContentChanging instead of ShowsScrollingPlaceholders.
<Page x:Class="GridVirtualization.MainPage" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:local="using:GridVirtualization" xmlns:d=http://schemas.microsoft.com/expression/blend/2008 xmlns:mc=http://schemas.openxmlformats.org/markup-compatibility/2006 mc:Ignorable="d"> <Page.Resources> <CollectionViewSource x:Key="PictureSource" IsSourceGrouped="True" Source="{Binding}"/> </Page.Resources> <Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}" Margin="40"> <GridView ItemsSource="{Binding Source={Binding PictureSource}}" SelectionMode="None" ShowsScrollingPlaceholders="False" ContainerContentChanging="ContainerContentChanging" > <GridView.ItemTemplate> <DataTemplate> <Border BorderBrush="Black" BorderThickness="1" Background="DarkGray"> <Grid Width="97" Height="65" Margin="5"> <Grid.RowDefinitions> <RowDefinition Height="auto" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Rectangle x:Name="dropShadow" Fill="Black" Opacity="0.8" Width="94" Height="62" Margin="5,5,0,0"/> <Image x:Name="picture" Grid.Column="0" Source="{Binding Location}" VerticalAlignment="Top" HorizontalAlignment="Left" Width="94" Height="62"/> <Rectangle x:Name="textBackground" Fill="Black" Opacity="0.75" Height="35" VerticalAlignment="Bottom" Margin="0,0,3,3"/> <StackPanel Grid.Column="1" Margin="3,0,0,6" VerticalAlignment="Bottom" > <TextBlock x:Name="titleText" Text="{Binding Title}" FontSize="12" /> <TextBlock x:Name="descText" Text="{Binding Tagline}" FontSize="10" /> </StackPanel> </Grid> </Border> </DataTemplate> </GridView.ItemTemplate> <GridView.ItemsPanel> <ItemsPanelTemplate> <ItemsWrapGrid /> </ItemsPanelTemplate> </GridView.ItemsPanel> <GridView.GroupStyle> <GroupStyle> <GroupStyle.HeaderTemplate> <DataTemplate> <TextBlock Style="{StaticResource SubheaderTextBlockStyle}" Text="{Binding Key}" Margin="10"/> </DataTemplate> </GroupStyle.HeaderTemplate> <GroupStyle.Panel> <ItemsPanelTemplate> <VariableSizedWrapGrid/> </ItemsPanelTemplate> </GroupStyle.Panel> </GroupStyle> </GridView.GroupStyle> </GridView> </Grid> </Page>
The updated XAML markup hooks up an event handler for the ContainerContentChanging event, so obviously we need to implement that.
Before I go through the implementation, though, let me stress that this handler is called for each element in the grid as part of rendering. This affects the time XAML spends creating display frames; recall from Chapter 2 that XAML breaks the layout of the user interface into a number of frames. If creating the frames requires a lot of work, XAML will not be able to keep up with user interactions and responsiveness will suffer.
Take a look at the implementation of the initial event handler in Listing 3-11. It doesn’t do a lot, which is intentional because this method is called synchronously for each element. Whatever complex work you need to do to render the elements should be done at later stages, as I’ll discuss momentarily. The handler first sets the Handled property on the event arguments, which informs the XAML engine that the current element has been handled. This is an important optimization that tells XAML it can skip its usual work for this element.
It then digs out all the references for the UI elements it needs to access and sets the Opacity of those to 0. This instructs the XAML engine to skip those elements during rendering because they are not visible. The elements are still part of the visual tree, however. The border element is left unchanged, so it still renders for each element. In other words, the border is the first visible piece in my custom placeholder. Finally, the handler sets up an additional callback for this element, signaling that there’s more work to do before rendering is complete. Unlike the first handler, subsequent callbacks are invoked asynchronously when XAML has the bandwidth to render additional details.
LISTING 3-11 Implementation of ContainerContentChanging.
private void ContainerContentChanging(ListViewBase sender,
    ContainerContentChangingEventArgs args)
{
    // For improved performance, set Handled to true
    // so the app does not set content on this item
    args.Handled = true;

    var templateRoot = (Border)args.ItemContainer.ContentTemplateRoot;
    var textBg = (Rectangle)templateRoot.FindName("textBackground");
    var title = (TextBlock)templateRoot.FindName("titleText");
    var desc = (TextBlock)templateRoot.FindName("descText");
    var dropShadow = (Rectangle)templateRoot.FindName("dropShadow");
    var image = (Image)templateRoot.FindName("picture");

    textBg.Opacity = 0;
    title.Opacity = 0;
    desc.Opacity = 0;
    dropShadow.Opacity = 0;
    image.Opacity = 0;

    args.RegisterUpdateCallback(ShowText);
}
You can chain as many callbacks as you like, but if you need to chain more than a few, your elements might be too complex to be displayed efficiently in a grid or a list. This sample sets up a chain of three event handlers. The next handler displays the title and the description for the picture. The final handler displays the image and the drop shadow. The two remaining callbacks are shown in Listing 3-12.
LISTING 3-12 Remaining callbacks for the abstract pictures app.
private void ShowText(ListViewBase sender, ContainerContentChangingEventArgs args)
{
    var picture = (AbstractPicture)args.Item;
    var templateRoot = (Border)args.ItemContainer.ContentTemplateRoot;
    var textBg = (Rectangle)templateRoot.FindName("textBackground");
    var title = (TextBlock)templateRoot.FindName("titleText");
    var desc = (TextBlock)templateRoot.FindName("descText");

    title.Text = picture.Title;
    desc.Text = picture.Description;

    textBg.Opacity = 1;
    title.Opacity = 1;
    desc.Opacity = 1;

    args.RegisterUpdateCallback(ShowPicture);
}

private void ShowPicture(ListViewBase sender, ContainerContentChangingEventArgs args)
{
    var picture = (AbstractPicture)args.Item;
    var templateRoot = (Border)args.ItemContainer.ContentTemplateRoot;
    var dropShadow = (Rectangle)templateRoot.FindName("dropShadow");
    var image = (Image)templateRoot.FindName("picture");

    image.Source = picture.Location;

    dropShadow.Opacity = 1;
    image.Opacity = 1;
}
They both follow the same general idea. Each callback surfaces additional detail about the item being handled. Notice how the actual data item is passed in the Item property of args. In ShowText, this is used to retrieve the title and description of the picture and expose those. When data is ready, the corresponding UI elements are made visible by setting the Opacity property to 1, thus rendering the data during that phase.
ShowPicture doesn’t set up another callback, so it completes the chain of callbacks. When ShowPicture completes, the current item is rendered completely.
With the preceding implementation, each element renders as follows:
- Border
- Title and description on top of a rectangle
- Picture and drop shadow
ContainerContentChanging offers a great way to customize rendering for complex elements in grids and lists during panning. However, keep a couple of things in mind when using this approach. Splitting the rendering into multiple phases adds overhead to the work needed for each element, so you might need to try various approaches to get the balance right. The first method in the chain of methods is special. It must make sure to set the Handled property on the ContainerContentChangingEventArgs instance, and it should execute as quickly as possible because it is invoked synchronously for each element. The work you do in this method adds latency to the act of rendering the grid, so keep it as brief as possible.
Before you use ContainerContentChanging, you should simplify the layout for each element as much as possible. The goal is to make panning fast and smooth. The best way to do that is to reduce the amount of work XAML has to do for each element, reduce the number of elements, or both. Once you have done that, you can use ContainerContentChanging to partition the remaining work if needed.
Cache information to improve transitions
It doesn’t matter if your app does heavy calculations, performs complex queries, or downloads truckloads of data from the cloud—retrieving the necessary data can be time consuming. As discussed, you don’t want your app to wait for the data to become available because that leads to a bad user experience. Furthermore, when the data is finally present, you might want to hold on to it to avoid the overhead of calculating or retrieving the data again. Of course, holding on to data is not free either, so you need to come up with a good strategy for what you want your app to cache and how long data should be cached. Remember, a cache without an expiration policy is just a fancy word for a memory leak.
Getting caching right can be tricky. If you don’t get it right, you might cause your app’s memory usage to balloon, with no noticeable gain in performance. Before you implement caching, make sure you collect the data to support your decision. Once you implement your caching, you need to measure and adjust the implementation as necessary.
When it comes to caching, you’re generally looking for scenarios where getting the required data takes longer than desired and the bulk of the data is reused. If the data isn’t reused, caching it makes no sense regardless of how long it takes to retrieve it. If some of the data is reused, you need to consider the hit/miss ratio. If your app has to hold on to large amounts of data to facilitate caching, you might run into problems because of excessive memory usage, as described in Chapter 2. You might need to experiment with different approaches to get the balance right. As I said, getting caching right isn’t easy.
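One way to gather that data is to make the cache itself report how often it is hit. The following is a minimal sketch of an in-memory cache with hit and miss counters and a simple age-based expiration; the class, its names, and the policy are illustrative assumptions rather than a platform API.

// Minimal sketch of an in-memory cache that tracks its own hit/miss ratio.
// Type names and the age-based expiration policy are illustrative assumptions.
class SimpleCache<TKey, TValue>
{
    private readonly Dictionary<TKey, Tuple<DateTime, TValue>> entries =
        new Dictionary<TKey, Tuple<DateTime, TValue>>();
    private readonly TimeSpan maxAge;

    public int Hits { get; private set; }
    public int Misses { get; private set; }

    public SimpleCache(TimeSpan maxAge) { this.maxAge = maxAge; }

    public bool TryGet(TKey key, out TValue value)
    {
        Tuple<DateTime, TValue> entry;
        if (entries.TryGetValue(key, out entry) &&
            DateTime.UtcNow - entry.Item1 < maxAge)
        {
            Hits++;
            value = entry.Item2;
            return true;
        }

        Misses++;
        value = default(TValue);
        return false;
    }

    public void Put(TKey key, TValue value)
    {
        // Overwrite any existing entry with a fresh timestamp.
        entries[key] = Tuple.Create(DateTime.UtcNow, value);
    }
}

Logging the Hits and Misses values during a typical session tells you whether the cache actually pays for the memory it consumes.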
Once you identify scenarios that might benefit from caching, consider the following:
- Is caching already available? Anything your app retrieves over HTTP can be cached, and the protocol has its own scheme for controlling caching. To use HTTP caching, you need to be able to control how data is handled on the server. If that’s an option, HTTP caching is a good choice because it provides an easy-to-use, well-understood mechanism for caching resources, and best of all, it is completely transparent to the app.
- Can results be cached for multiple users? Some calculations and queries are common across many or all the user sessions. These can be executed and cached on the server instead of letting each client do the work. This can eliminate the work in the majority of cases if many scenarios use the same data. However, it obviously adds additional network latency to the equation. For this to be attractive, the cost of doing the work locally must exceed the latency.
- Does data retrieval follow a pattern? In some scenarios, there’s a good chance you can predict subsequent queries or calculations based on the current action. If that’s the case, you can use this to prepopulate your app cache. This works for both local caches and when interacting with a back end. For example, if your app uses a local cache, the current action can trigger a background task that proactively calculates the next expected result set and stores that. A CancellationToken can be used to cancel the action if the prediction turns out to be incorrect. For back-end-based solutions, the server can return the requested data as well as data likely to be returned for the next query or queries. Doing so reduces the number of network round trips between the app and the back end for each request.
- Is the data valid beyond the current user interaction? If the user closes and restarts the app, will it need the same data again? If the app needs the same data, the cache should be persisted locally and read upon restart. There are various options for persisting data. For data that is read in bulk, just serializing the objects and writing them to local storage is a good approach (see the sketch following this list). If data is read back selectively, a local database such as SQLite might be a better option. Persisting and releasing the data can also reduce the memory usage of the app. This works well if different parts of the app use distinct sets of data.
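As a rough sketch of the bulk approach, the following persists an already serialized payload to the app’s local folder and reads it back on the next launch, treating an old or missing file as a cache miss. The file name and the age check are assumptions you would adapt to your own app; the file APIs come from the Windows.Storage namespaces (plus System.IO for FileNotFoundException).

// Sketch: persist a cached payload (already serialized to a string, for example
// JSON) to local storage, and read it back on the next launch.
// The file name and the expiration window are illustrative assumptions.
private async Task SaveCacheAsync(string serializedData)
{
    StorageFolder folder = ApplicationData.Current.LocalFolder;
    StorageFile file = await folder.CreateFileAsync(
        "contentCache.json", CreationCollisionOption.ReplaceExisting);
    await FileIO.WriteTextAsync(file, serializedData);
}

private async Task<string> LoadCacheAsync(TimeSpan maxAge)
{
    try
    {
        StorageFolder folder = ApplicationData.Current.LocalFolder;
        StorageFile file = await folder.GetFileAsync("contentCache.json");
        BasicProperties props = await file.GetBasicPropertiesAsync();
        if (DateTimeOffset.UtcNow - props.DateModified > maxAge)
        {
            return null;    // Cache exists but is too old; treat as a miss.
        }
        return await FileIO.ReadTextAsync(file);
    }
    catch (FileNotFoundException)
    {
        return null;        // No cache yet; treat as a miss.
    }
}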
Releasing memory on demand
The problem with holding on to too much data is that the memory usage of the app grows to undesirable levels. This affects overall system performance and increases the risk of the app being terminated by the operating system. In a .NET app, you don’t manage memory usage explicitly—instead, the Common Language Runtime (CLR) allocates and frees memory based on the lifetime of objects. As long as your app holds references to objects, they are not reclaimed during garbage collection and the associated memory segments are not released.
Weak references allow your app to release memory on demand. When your app holds a strong reference to an object or a graph of objects, these are considered in use and thus not reclaimed during garbage collection. A weak reference allows your app to hold on to an object or a collection of objects while still permitting the objects to be collected during garbage collection. By using weak references, your app can hold on to data and release it automatically if needed. Listing 3-13 shows how to use the WeakReference class to hold on to a list of items.
LISTING 3-13 Using a weak reference to hold on to a list.
// Instance field
WeakReference<List<SomeType>> weakRef;

...

// Some place in the code
...
var list = GetListOfObjects();

// Create a weak reference to the list
weakRef = new WeakReference<List<SomeType>>(list);

...

// Somewhere else in the code
...
List<SomeType> list = null;
if (!weakRef.TryGetTarget(out list))
{
    // The list was reclaimed during GC, so we have to re-create it
    list = GetListOfObjects();
}
The code is straightforward. weakRef is an instance field on some class that uses a List<SomeType>. Initially, I create a list using a strong local reference. As long as that reference is alive, the list cannot be reclaimed. Next, I assign a WeakReference to the field on the current object that points to the same list. When the original strong reference is no longer valid, the weak reference holds on to the list, but it will not prevent the list from being reclaimed if a garbage collection needs to reclaim the associated memory.
Because the list might have been reclaimed, I need to check whether weakRef still references it before I can access the list. This is done through the TryGetTarget method. If it returns false, the list could not be retrieved and I need to re-create it. If it returns true, the list reference is set through the out parameter. This gives me a strong reference to the original list object again and prevents it from being collected while it is in use.
You can use weak references in your data model to hold on to data as needed. Weak references are useful for data that can be re-created quickly. For data retrieved from the network, weak references might not be the best solution because garbage collections trigger the need to download the data again. For such data, it is better to persist it locally. Once that’s done, you can use weak references to hold on to in-memory copies of the data.
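To tie the two ideas together, the following sketch keeps the in-memory copy behind a weak reference and falls back to the locally persisted copy, rather than the network, when the garbage collector has reclaimed it. LoadFromLocalStorageAsync and SomeType are hypothetical placeholders for your own persistence code and data type.

// Sketch: combine a weak reference with locally persisted data so that a
// garbage collection does not force another network download.
// LoadFromLocalStorageAsync and SomeType are hypothetical placeholders.
private WeakReference<List<SomeType>> cachedItems;

private async Task<List<SomeType>> GetItemsAsync()
{
    List<SomeType> items = null;
    if (cachedItems != null && cachedItems.TryGetTarget(out items))
    {
        return items;   // Still in memory; no further work needed.
    }

    // The in-memory copy was reclaimed (or never created); reload the
    // persisted copy from local storage instead of hitting the network.
    items = await LoadFromLocalStorageAsync();
    cachedItems = new WeakReference<List<SomeType>>(items);
    return items;
}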