<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Working, Gaming, Life]]></title><description><![CDATA[I'm Paul Devenney; developer, architect and scrum master. I've been a consultant and contractor, everything from pre-sales to design and delivery of a product. In my spare time I'm an avid gamer.]]></description><link>http://www.pauldevenney.com/</link><generator>Ghost 0.7</generator><lastBuildDate>Thu, 16 Apr 2026 12:38:22 GMT</lastBuildDate><atom:link href="http://www.pauldevenney.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Using export configuration files in Orchard Import / Export]]></title><description><![CDATA[<p>In the more recent versions of Orchard (I think it arrived in 1.9) the export options available at Import/Export became much more powerful. One of the underused features (thanks to a lack of documentation) is the "Upload a configuration file with recipe steps to execute" option.</p>

<p><img src="http://www.pauldevenney.com/content/images/fromblogger/importexport1.PNG" alt="importexport1.PNG">
With this</p>]]></description><link>http://www.pauldevenney.com/using_export_configuration_files_in_orchard_import__export/</link><guid isPermaLink="false">9cab783a-3b3d-446e-8fcd-c059b9cb4a39</guid><category><![CDATA[Work]]></category><category><![CDATA[Orchard]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Mon, 11 Jan 2016 15:22:00 GMT</pubDate><content:encoded><![CDATA[<p>In the more recent versions of Orchard (I think it arrived in 1.9) the export options available at Import/Export became much more powerful. One of the underused features (thanks to a lack of documentation) is the "Upload a configuration file with recipe steps to execute" option.</p>

<p><img src="http://www.pauldevenney.com/content/images/fromblogger/importexport1.PNG" alt="importexport1.PNG">
With this option selected, you can upload an XML file that defines which content types and other custom export features you wish to export with your site, giving you a repeatable process and avoiding manual mistakes.</p>

<p>Here is an example of such a file.</p>

<script src="https://gist.github.com/PaulDevenney/21b7db9e2f639000dde9.js"></script>

<p>Note that "PagewithPromotion" is a custom content type that has been added to the site in this example. Unfortunately, at present there appears to be only "inclusive" syntax for listing which content types to export. I'm going to raise an issue for adding "exclude" syntax so that it is easier to say "all content types except bob".</p>

<p>I've constructed this template by looking through the source code. The example works, but there are probably also options I've not spotted, so feel free to explore the code further!</p>]]></content:encoded></item><item><title><![CDATA[Serializing Enums to String in MVC6]]></title><description><![CDATA[<p>Often you will want to change the default serialization of your API responses, to ensure that enums are reflected as strings, not integers. Why? Well, which is more self documenting?</p>

<pre><code>{
    "Status": 1,
    "Message": "things are on fire"
}
</code></pre>

<p>or</p>

<pre><code>{
    "Status": "Error",
    "Message": "things are on fire"
}
</code></pre>

<p>Fortunately this is really easy</p>]]></description><link>http://www.pauldevenney.com/serializing_enums_to_string_in_mvc6/</link><guid isPermaLink="false">1646edf2-46b9-4485-b1c1-ec93dc16124e</guid><category><![CDATA[MVC]]></category><category><![CDATA[.NET]]></category><category><![CDATA[Work]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Thu, 07 Jan 2016 12:11:00 GMT</pubDate><content:encoded><![CDATA[<p>Often you will want to change the default serialization of your API responses, to ensure that enums are reflected as strings, not integers. Why? Well, which is more self documenting?</p>

<pre><code>{
    "Status": 1,
    "Message": "things are on fire"
}
</code></pre>

<p>or</p>

<pre><code>{
    "Status": "Error",
    "Message": "things are on fire"
}
</code></pre>

<p>Fortunately this is really easy in MVC6. The default code uses the ever-popular JSON.NET, and its options are easily exposed. Note that this code example is against RC1, so it should remain correct going forward (the syntax has changed many times across vnext). I'm also assuming you have created your site using the template application, rather than creating an empty ASP.NET application.</p>

<p>Inside the "ConfigureServices" method find <code>services.AddMvc();</code> and replace it with</p>

<pre><code>// requires: using Newtonsoft.Json.Converters;
services.AddMvc().AddJsonOptions(options =&gt;
{
    options.SerializerSettings.Converters.Add(new StringEnumConverter());
});
</code></pre>
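<p>If you would rather not change the behaviour globally, JSON.NET also lets you apply the converter per enum with an attribute. A quick sketch; the <code>Status</code> enum here is just an invented example:</p>

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;

// Decorating the enum itself means it serializes as a string wherever
// it appears, without touching the global MVC serializer settings.
[JsonConverter(typeof(StringEnumConverter))]
public enum Status
{
    Ok = 0,
    Error = 1
}
```

<p>With the attribute in place, <code>JsonConvert.SerializeObject(Status.Error)</code> produces <code>"Error"</code> rather than <code>1</code>.</p>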

<p>As you can see, you should be able to get at all the other json.net options from here too.</p>]]></content:encoded></item><item><title><![CDATA[Lambda expressions in unit testing are hard]]></title><description><![CDATA[<p>So, it turns out that lambda expressions are not great for unit testing. A recent example of code that I wanted to unit test:</p>

<pre style="background: #f0f0f0; border: 1px dashed #CCCCCC; color: black; font-family: arial; font-size: 12px; height: auto; line-height: 20px; overflow: auto; padding: 0px; text-align: left; width: 99%;"><code style="color: black; word-wrap: normal;"> public async Task RegisterIndexAsync&lt;T&gt;(IFoundocIndex&lt;T&gt; index, CancellationToken cancellationToken)  
     {  
       //.....  
       await _fdbStorageProvider.ReadWriteAsync(async transaction =&gt;  
       {  
         var indexDefinitionState = await _indexProvider.GetIndexDefinitionStateFromStore(transaction,</code></pre>]]></description><link>http://www.pauldevenney.com/lamda_expressions_in_unit_testing_are_hard/</link><guid isPermaLink="false">624d56cb-ed1a-4fdc-94ae-61633affb5ea</guid><category><![CDATA[Work]]></category><category><![CDATA[.NET]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Tue, 23 Sep 2014 12:54:00 GMT</pubDate><content:encoded><![CDATA[<p>So, it turns out that lambda expressions are not great for unit testing. A recent example of code that I wanted to unit test:</p>

<pre style="background: #f0f0f0; border: 1px dashed #CCCCCC; color: black; font-family: arial; font-size: 12px; height: auto; line-height: 20px; overflow: auto; padding: 0px; text-align: left; width: 99%;"><code style="color: black; word-wrap: normal;"> public async Task RegisterIndexAsync&lt;T&gt;(IFoundocIndex&lt;T&gt; index, CancellationToken cancellationToken)  
     {  
       //.....  
       var rebuildIndex = false; // declared here so the lambda below can capture it  
       await _fdbStorageProvider.ReadWriteAsync(async transaction =&gt;  
       {  
         var indexDefinitionState = await _indexProvider.GetIndexDefinitionStateFromStore(transaction, index).ConfigureAwait(false);  
         if (!indexDefinitionState.Exists || indexDefinitionState.Changed)  
         {  
           await _indexProvider.PersistIndexToStore(transaction, index).ConfigureAwait(false);  
           var documentCount = await _documentProvider.Count&lt;T&gt;(transaction).ConfigureAwait(false);  
           if (documentCount &gt; 0)  
           {  
             Trace.WriteLine(documentCount + " items found in collection");  
             rebuildIndex = true;  
           }  
         }  
       }).ConfigureAwait(false);  
       if (rebuildIndex)  
       {  
         Trace.WriteLine("Rebuilding Index: " + index.Name);  
         await RebuildIndexAsync(index, cancellationToken).ConfigureAwait(false);  
       }  
       else  
       {  
         Trace.WriteLine(String.Format("Not Rebuilding Index: {0}.", index.Name));  
       }  
     }  
     public async Task RebuildIndexAsync&lt;T&gt;(IFoundocIndex&lt;T&gt; index, CancellationToken cancellationToken)  
     {  
       using (var queue = new BlockingCollection&lt;IEnumerable&lt;T&gt;&gt;(_settings.MaxBatchesInIndexQueue))  
       {  
         _batchEntityProvider.GetBatches(queue, _settings.MaxIndexBatchSize).ConfigureAwait(false);//deliberately not awaiting this  
         await _batchConsumer.Consume(queue, cancellationToken, async batch =&gt; await ConsumeWorkQueue(batch));  
       }  
     }  
</code></pre>  

<p>Now of course, I'm unit testing an implementation call to&nbsp;<span style="background-color: #f0f0f0; font-family: arial; font-size: 12px; line-height: 20px;">RegisterIndexAsync&lt;T&gt;(IFoundocIndex&lt;T&gt; index, CancellationToken cancellationToken)</span>, but I also want to verify in this test that my index was not rebuilt. Normally you could do this by mocking (for example, using the amazing <a href="https://github.com/Moq/moq4">Moq</a>) and verifying the number of calls to&nbsp;<span style="background-color: #f0f0f0; font-family: arial; font-size: 12px; line-height: 20px;">_batchEntityProvider.GetBatches</span>, but here there is a complication. </p>

<p>In this example you would need to use a Setup operation on&nbsp;<span style="background-color: #f0f0f0; font-family: arial; font-size: 12px; line-height: 20px;">_fdbStorageProvider.ReadWriteAsync</span>&nbsp;that would supply the entirety of the lambda expression as its setup. Essentially, you would need to know and express the code for this function in your unit test setup. Your unit test becomes "ensure that what the code does is what the code does" - and this is not right.</p>
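<p>For completeness: one partial workaround (not what I did here) is to match the lambda argument loosely with <code>It.IsAny</code> and invoke whatever delegate the production code passes in. This is only a sketch - <code>IStorageProvider</code> and <code>ITransaction</code> are invented stand-ins for the real interfaces:</p>

```csharp
using System;
using System.Threading.Tasks;
using Moq;

public interface ITransaction { }                        // hypothetical stand-in
public interface IStorageProvider                        // hypothetical stand-in
{
    Task ReadWriteAsync(Func<ITransaction, Task> work);
}

public static class StorageStub
{
    public static Mock<IStorageProvider> Create()
    {
        var storage = new Mock<IStorageProvider>();
        // Match ANY lambda rather than a specific one, then run it, so the
        // code inside the lambda still executes against the other mocks.
        storage.Setup(s => s.ReadWriteAsync(It.IsAny<Func<ITransaction, Task>>()))
               .Returns((Func<ITransaction, Task> work) => work(Mock.Of<ITransaction>()));
        return storage;
    }
}
```

<p>This keeps the lambda's body out of the test, but it still says nothing about <i>which</i> lambda was passed - which is exactly the limitation being discussed.</p>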

<p>It is also extremely hard to do due to the way lambda expressions compile - two textually identical lambdas compile to different delegate objects - so the expression in your test will never be the same object as the one supplied by the code under test in your Moq setup call.</p>
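<p>A minimal console sketch of that compilation behaviour (nothing Moq-specific here):</p>

```csharp
using System;

class LambdaIdentity
{
    static void Main()
    {
        // Two textually identical lambdas are compiled into two separate
        // generated methods, each with its own cached delegate instance.
        Func<int, int> first = x => x + 1;
        Func<int, int> second = x => x + 1;

        Console.WriteLine(first.Equals(second));           // False
        Console.WriteLine(ReferenceEquals(first, second)); // False
    }
}
```

<p>Since delegate equality compares the backing method, a Setup keyed on one lambda will never match another, even identical, lambda.</p>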

<p>Looking more deeply into this example, you could validly say that I shouldn't care about verifying whether an internal operation is called. All I should be worried about are external results, right?</p>

<p>Quite possibly true in this case - if I cannot observe the impact through external interfaces, it's probably not worth knowing, right? Except that I need to know if this method is rebuilding an index unnecessarily. If it does, there will be no observable difference - the index would be the same before and after; the only difference would be the time taken on larger indexes, something you can't identify in a unit test (and using time taken as part of a test is a lousy idea anyway).</p>

<p>For this specific test, I think it is time to head back to integration tests, which leads to the future problem of how you do the equivalent of "Verify" when you are not mocking your classes. That's for another time.</p>]]></content:encoded></item><item><title><![CDATA[How to Setup Async and Task Return methods with Moq 4.2]]></title><description><![CDATA[<p>Moq 4.2 comes with a couple of nice changes that I hadn't noticed (and they are extension methods, so you might continue to miss them)</p>

<p>The main benefit is allowing you to change from writing  </p>

<div><pre style="background: #f0f0f0; border: 1px dashed #CCCCCC; color: black; font-family: arial; font-size: 12px; height: auto; line-height: 20px; overflow: auto; padding: 0px; text-align: left; width: 99%;"><code style="color: black; word-wrap: normal;"> _mock.Setup(m =&gt; m.GetStateAsync(It.IsAny&lt;Profile&gt;()))  
 .Returns(Task.FromResult(</code></pre></div>]]></description><link>http://www.pauldevenney.com/how_to_setup_async_and_task_return_methods_with_moq_4-2/</link><guid isPermaLink="false">40409dd1-3008-4ec4-b847-39972daf3c8f</guid><category><![CDATA[Work]]></category><category><![CDATA[.NET]]></category><category><![CDATA[Moq]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Thu, 18 Sep 2014 10:36:00 GMT</pubDate><content:encoded><![CDATA[<p>Moq 4.2 comes with a couple of nice changes that I hadn't noticed (and they are extension methods, so you might continue to miss them)</p>

<p>The main benefit is allowing you to change from writing  </p>

<div><pre style="background: #f0f0f0; border: 1px dashed #CCCCCC; color: black; font-family: arial; font-size: 12px; height: auto; line-height: 20px; overflow: auto; padding: 0px; text-align: left; width: 99%;"><code style="color: black; word-wrap: normal;"> _mock.Setup(m =&gt; m.GetStateAsync(It.IsAny&lt;Profile&gt;()))  
 .Returns(Task.FromResult(new IndexDefinitionState(true, true)));  
</code></pre>  
to writing

</div><div><pre style="background: #f0f0f0; border: 1px dashed #CCCCCC; color: black; font-family: arial; font-size: 12px; height: auto; line-height: 20px; overflow: auto; padding: 0px; text-align: left; width: 99%;"><code style="color: black; word-wrap: normal;">  _mock.Setup(m =&gt; m.GetStateAsync(It.IsAny&lt;Profile&gt;()))  
    .ReturnsAsync(new IndexDefinitionState(true, true));  
</code></pre><div>  
</div><div>...which is just that little bit easier to manage (especially when it is a more complex return type than the example above), but it <i>also</i> allows methods with a return type of Task to work without further setup, it seems. Both are extremely useful for the Async-first API I'm working on.</div><div>  
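<p>That "no further setup" behaviour follows from completed tasks becoming the default value. A sketch of what that buys you - <code>IIndexStore</code> is an invented interface for illustration:</p>

```csharp
using System.Threading.Tasks;
using Moq;

public interface IIndexStore        // hypothetical interface for illustration
{
    Task SaveAsync();
}

public static class Demo
{
    public static async Task Run()
    {
        var mock = new Mock<IIndexStore>();

        // No Setup call for SaveAsync: from Moq 4.2 the default value for a
        // Task-returning member is an already-completed task, so this await
        // returns immediately instead of hanging or throwing.
        await mock.Object.SaveAsync();
    }
}
```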
</div><div>From the release notes for Moq 4.2</div><ul><li>Improved support for async APIs by making default value a completed task</li><li>Added support for async Returns and Throws</li><li>Improved mock invocation sequence testing</li></ul><div>  
</div></div><div>All great stuff. I really couldn't do without Moq - a long time back it was the thing that made me realise that unit testing was actually viable.</div>]]></content:encoded></item><item><title><![CDATA[The battle for the internet is back on....]]></title><description><![CDATA[<p><a href="https://www.battleforthenet.com/">Battle for the Net</a>&nbsp;|&nbsp;<a href="https://www.youtube.com/watch?v=rz4Ej3IVefo">What is Net Neutrality</a></p>]]></description><link>http://www.pauldevenney.com/the_battle_for_the_internet_is_back_on----/</link><guid isPermaLink="false">69a3ca2c-6abe-489d-9ab8-6ab05436e024</guid><category><![CDATA[Life]]></category><category><![CDATA[Work]]></category><category><![CDATA[Gaming]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Wed, 10 Sep 2014 20:48:00 GMT</pubDate><content:encoded><![CDATA[<p><a href="https://www.battleforthenet.com/">Battle for the Net</a>&nbsp;|&nbsp;<a href="https://www.youtube.com/watch?v=rz4Ej3IVefo">What is Net Neutrality</a></p>]]></content:encoded></item><item><title><![CDATA[Thoughts on The Phoenix Project]]></title><description><![CDATA[<p>I finally got round to reading <a href="http://www.amazon.co.uk/The-Phoenix-Project-Helping-Business/dp/0988262592/ref=sr_1_1?ie=UTF8&amp;qid=1410292103&amp;sr=8-1&amp;keywords=the+phoenix+project">The Phoenix Project</a>&nbsp;last weekend. I know right? It's about time. 
I thought I'd share a few thoughts, as I think it's a great book and well worth a read for anyone in a business delivering products depending on IT (hint, nearly <i>every</i></p>]]></description><link>http://www.pauldevenney.com/thoughts_on_the_phoenix_project/</link><guid isPermaLink="false">0985cdad-3741-43c2-ae14-d0cf1900cf31</guid><category><![CDATA[Work]]></category><category><![CDATA[Books]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Tue, 09 Sep 2014 22:13:00 GMT</pubDate><content:encoded><![CDATA[<p>I finally got round to reading <a href="http://www.amazon.co.uk/The-Phoenix-Project-Helping-Business/dp/0988262592/ref=sr_1_1?ie=UTF8&amp;qid=1410292103&amp;sr=8-1&amp;keywords=the+phoenix+project">The Phoenix Project</a>&nbsp;last weekend. I know right? It's about time. I thought I'd share a few thoughts, as I think it's a great book and well worth a read for anyone in a business delivering products depending on IT (hint, nearly <i>every</i>&nbsp;business of any size these days depends on successful IT supporting the core business functions).</p>

<p>The Phoenix Project is told as a novel about the recovery story of a business that used to be the best widget maker in the world, and is now being pounded on by a faster, more agile, hungrier up-and-comer. We follow Bill, who is promoted into the seventh circle of hell - VP of IT Operations, with only 90 days until the release of the-mother-of-all-projects, which is vital to the company's survival. It's failing hard, VPs are falling left and right, every meeting looks like Game of Thrones, and every piece of the puzzle seems to depend on one engineer, Brent, whose time is more oversubscribed than a year 2000 dotcom IPO.</p>

<p>Before I go any further - I cannot recommend enough the concept of getting your message through as a novel, rather than a text book. There are other books that take a similar approach; <a href="http://www.amazon.co.uk/The-Five-Dysfunctions-Team-Leadership/dp/0787960756/ref=sr_1_1?ie=UTF8&amp;qid=undefined&amp;sr=8-1&amp;keywords=the+five+dysfunctions+of+a+team">The Five Dysfunctions of a Team</a>&nbsp;is another one I love, and for the same reason. It takes dry, bullet-point material and turns it into a "what happens next" adventure. You are with the protagonist as they solve their problems, and your brain is following the same steps as they do all the way through. A fiction is worth a thousand bullet points, you might say. Consider them "text books with only extremely coherent examples". I read The Phoenix Project in two days. I can't remember the last time I read a Terry Pratchett so fast.</p>

<p>There are some key messages to take from it. I want to avoid explaining the entire story of the book, because half of its power is you working the problem yourself.</p>

<p>The first is not actually one the book focuses on explicitly in its "lessons", but is worth learning: while development, operations, sales, marketing, "products" etc are all sniping at each other and seeing all other departments as "getting in the way of real work", you are in a pretty bad spot. Maybe a more positive assertion is: <b>All teams need to work together with the vision that they are all responsible for delivering the company's core product</b></p>

<p>The second is more explicit: <b>Understand the definition of work and from that understand what things your team actually works on and prioritise it</b>. The book lists four kinds, but you could easily make it two: <i>planned work</i> and <i>unplanned work</i>. The book splits planned into business projects, internal projects (e.g. an infrastructure upgrade) and changes (e.g. a production db schema update). The obvious question is this - if you don't know all the work that your team are being asked to do, or where it comes from, how can you possibly ensure that you prioritise it correctly?</p>

<p>The next is a classic from lean, and is well developed in books such as The Toyota Way, and The Lean Startup. <b>Work in progress kills productivity</b>. In IT terms, anything that's started, but not working correctly in production, is useless to the business. It is money spent and no return gained. When stock analysts look at companies they are interested in investing in, they look at how well they convert raised capital into further gains, and so should we as IT professionals. It ties in very closely to one of my own favourite mantras on "the definition of done", which I firmly believe can only be interpreted as <i>working bug-free as desired by the client or business in the production environment. </i>Why such a stringent definition? Anything else can still come back as a task on your plate. You have to <i>context switch</i> (not multi-task, you actually can't do that you know), and internally prioritise. Putting something on hold either requires carefully "putting something back on the shelf", or even worse, simply dropping it without care or attention.</p>

<p>Possibly the most unique lesson in the book is: <b>Any attempt to optimise a process that doesn't improve the time of that process's bottleneck is false progress</b>. This is gold dust, and if there is nothing else in this book for you, take that. When you read the book you see things in a new light (unless you've already read <a href="http://www.amazon.co.uk/The-Goal-Process-Ongoing-Improvement/dp/0566086654/ref=sr_1_1?ie=UTF8&amp;qid=1410295876&amp;sr=8-1&amp;keywords=The+goal">The Goal</a>). In any process - look at the bottleneck and refine that and that only, until it is no longer the bottleneck. Then find the new bottleneck. The book's example is the amazing "Brent" - at the heart of every production incident, architectural planning exercise and key software project deliverable. If he gets hit by a bus, the company could literally fold. However, after several chapters of removing Brent from every coal face so that he can actually improve the overall process of the organisation comes the big reveal. Brent is not a "work centre" - he's only a person at the work centre. Like any manufacturing plant, many parts of what we do <i>are</i>&nbsp;automatable - particularly in the deployment life cycle and testing. Which leads to...</p>

<p><b>Identify your work centres</b>: Identify all the parts of your software development cycle, they are your work centres. Now find the bottleneck among those, and improve that. Remember the golden rule above - if you are not improving the bottleneck of the process, then your effort will not reap the benefits you desire.</p>

<p>These last two are the current topics of my own contemplation, as we try to reduce our continuous deployment cycle down to where we know others have already reached. It finally gives me a strategy for approaching the problem that doesn't involve simply "refine all the things". It may be obvious to say "improve your worst bits first", but what you think are your "worst bits" might not actually turn out to be your bottlenecks, so improving them would be <i>false progress</i>.</p>

<p>There are lots of other excellent snippets in there (and probably some major points I glossed over, but hey, you are going to read it anyway, right?). Why ten minutes' work might take several days to be acted upon, for example - these just add more and more interesting food for thought.</p>

<p>I'll certainly be re-reading The Phoenix Project - probably any time I get stuck on how we improve next - but first up I'll be reading <a href="http://www.amazon.co.uk/The-Goal-Process-Ongoing-Improvement/dp/0566086654/ref=sr_1_1?ie=UTF8&amp;qid=1410295876&amp;sr=8-1&amp;keywords=The+goal">The Goal</a>.</p>

<p>Happy reading :)</p>]]></content:encoded></item><item><title><![CDATA[Could not copy the file manifest because it was not found]]></title><description><![CDATA[<p>This is an old chestnut that has maddened me for quite some time. It all starts with the error message</p>

<ul class="compiler-errors" id="compileErrorsData" style="background-color: white; color: #151515; font-family: 'helvetica neue', arial, sans-serif; font-size: 12.727272033691406px; line-height: 19.68000030517578px; list-style: none; margin: 5px 0px; padding-left: 0px;"><li class="compiler-error" style="color: #ed2c10; font-family: Menlo, 'Bitstream Vera Sans Mono', 'Courier New', Courier, monospace; font-size: 12px; padding-left: 27px;">C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets(4453,&nbsp;5):&nbsp;error&nbsp;MSB3030:&nbsp;Could&nbsp;not&nbsp;copy&nbsp;the&nbsp;file&nbsp;"bin\</li></ul>]]></description><link>http://www.pauldevenney.com/could_not_copy_the_file_manifest_because_it_was_not_found/</link><guid isPermaLink="false">c6e86277-5722-47e2-a382-96114152434e</guid><category><![CDATA[Work]]></category><category><![CDATA[.NET]]></category><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Tue, 22 Jul 2014 15:27:00 GMT</pubDate><content:encoded><![CDATA[<p>This is an old chestnut that has maddened me for quite some time. It all starts with the error message</p>

<ul class="compiler-errors" id="compileErrorsData" style="background-color: white; color: #151515; font-family: 'helvetica neue', arial, sans-serif; font-size: 12.727272033691406px; line-height: 19.68000030517578px; list-style: none; margin: 5px 0px; padding-left: 0px;"><li class="compiler-error" style="color: #ed2c10; font-family: Menlo, 'Bitstream Vera Sans Mono', 'Courier New', Courier, monospace; font-size: 12px; padding-left: 27px;">C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets(4453,&nbsp;5):&nbsp;error&nbsp;MSB3030:&nbsp;Could&nbsp;not&nbsp;copy&nbsp;the&nbsp;file&nbsp;"bin\Release\myapp.exe.manifest"&nbsp;because&nbsp;it&nbsp;was&nbsp;not&nbsp;found.&nbsp;</li><li class="compiler-error" style="color: #ed2c10; font-family: Menlo, 'Bitstream Vera Sans Mono', 'Courier New', Courier, monospace; font-size: 12px; padding-left: 27px;">Project&nbsp;src\mysolution\myapp\myapp.csproj&nbsp;failed.&nbsp;</li><li class="compiler-error" style="color: #ed2c10; font-family: Menlo, 'Bitstream Vera Sans Mono', 'Courier New', Courier, monospace; font-size: 12px; padding-left: 27px;">Project&nbsp;src\mysolution\mysolution.sln&nbsp;failed.&nbsp;</li></ul>  

<p>It almost certainly happens when you are doing the following:</p>

<ul><li>Using msbuild directly, rather than Visual Studio "right click" (e.g. you are using Team City)</li><li>Have a solution with a web application and some other application types (e.g. a console app)</li><li>Are using the Target "Publish" to publish the website</li></ul>  

<p>For example, here are the TC settings that cause the problem for me:</p>

<p><img src="http://www.pauldevenney.com/content/images/fromblogger/PublishSettings.PNG" alt="PublishSettings.PNG">
I'm also using a solution that has this structure</p>

<ul><li>MySolution</li><ul><li>MyConsoleApp</li><li>MyWebApp</li></ul></ul>  

<p>MyWebApp has a publish profile set up called "PublishToDisk". If I build or publish from Visual Studio, everything is fine. If I build from Team City, I get the errors above.</p>

<p>I've known the <i>cause</i>&nbsp;of the problem for ages. The blanket "Publish" target is being applied to every project in the solution that might be publishable. That includes my console application. My console app is not set up for publishing, however, nor do I want it to be. I do, however, want it to be built during the Team City process, as the app will get copied elsewhere via the packaging process in later stages.</p>

<p>However, the possible solutions are not so obvious. Running msbuild separately on MyConsoleApp.csproj and MyWebApp.csproj seems ridiculously inefficient. Making my console application a ClickOnce app just to stop a build failure seems equally silly. By far the best solution I've found is to edit your MyConsoleApp.csproj file and add the following section</p>

<pre class="csharpcode" style="background-color: white; font-family: Consolas, 'Courier New', Courier, monospace; font-size: small;"><span class="kwrd" style="color: blue;">&lt;</span><span class="html" style="color: maroon;">PropertyGroup</span><span class="kwrd" style="color: blue;">&gt;</span>  
    <span class="kwrd" style="color: blue;">&lt;</span><span class="html" style="color: maroon;">GenerateManifests</span><span class="kwrd" style="color: blue;">&gt;</span>true<span class="kwrd" style="color: blue;">&lt;/</span><span class="html" style="color: maroon;">GenerateManifests</span><span class="kwrd" style="color: blue;">&gt;
</span><span class="kwrd" style="color: blue;">&lt;/</span><span class="html" style="color: maroon;">PropertyGroup</span><span class="kwrd" style="color: blue;">&gt;</span></pre>  

<p>You can add this just before the ItemGroup for reference includes.</p>

<p>This should be enough to allow your application to build without a load of awkward-feeling bodges.</p>

<p>Incidentally - this problem is one that occurs if you try to do this publishing process with <a href="http://www.orchardproject.net/">Orchard</a> via TC, so this would solve that scenario too.</p>]]></content:encoded></item><item><title><![CDATA[Going Native is All Too Easy]]></title><description><![CDATA[<p>We tend to fall into the same traps over time, no matter how much we try. When I was a consultant, it was very easy to correct a client who demanded technical solutions instead of listing their business objectives. It was my job to re-translate that back into "what</p>]]></description><link>http://www.pauldevenney.com/going_native_is_all_too_easy/</link><guid isPermaLink="false">8929b904-5567-43b4-a32f-b042d1f01917</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Tue, 15 Jul 2014 17:01:00 GMT</pubDate><content:encoded><![CDATA[<p>We tend to fall into the same traps over time, no matter how much we try. When I was a consultant, it was very easy to correct a client who demanded technical solutions instead of listing their business objectives. It was my job to re-translate that back into "what is the real requirement here".</p>

<p>Now that I've been working as an architect at a single company for two years, I find myself falling into the traps that I used to help others avoid. In designing a new profile service, a group of three architects, plus other contributors, discussed the proposed structure. We went over all the technical implications, and we felt we understood what the business need was quite well, thank you, including what would need to be in an MVP and what could be deferred until later.</p>

<p>Four months on and I find myself in a very enlightening meeting with key business users, who have a <i>far</i>&nbsp;better understanding of what they need than I did, despite having "done my research". I find that several of the decisions I made, while valid in their way, just didn't go far enough to addressing the business need.</p>

<p>It is a timely reminder that the developer (no matter how well in tune with the business he believes himself to be) is ultimately far more focused on a technical challenge and "elegant solutions" than he is with what the end consumers want.</p>

<p><b>Lesson Learned</b>: If you think you have a solution to the business problem, ask yourself "have I sat in a room for an hour with four business users who have no interest in <i>how</i>&nbsp;it is done, but only <i>what it lets them do</i>?"</p>]]></content:encoded></item><item><title><![CDATA[Pros and Cons: Comparing RavenDB and FoundationDB]]></title><description><![CDATA[<p>We've recently been evaluating options for storage for a new profiles micro-service. The original prototype for this was produced in Raven, but recently other teams within the business have been having some level of success with FoundationDB. While RavenDB is touted as a "Document Store", FoundationDB claims to be simply</p>]]></description><link>http://www.pauldevenney.com/pros_and_cons_comparing_ravendb_and_foundationdb/</link><guid isPermaLink="false">f5f6ff8f-53e4-4b4a-a215-90f531db21e0</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Mon, 14 Jul 2014 11:15:00 GMT</pubDate><content:encoded><![CDATA[<p>We've recently been evaluating options for storage for a new profiles micro-service. The original prototype for this was produced in Raven, but recently other teams within the business have been having some level of success with FoundationDB. While RavenDB is touted as a "Document Store", FoundationDB claims to be simply a "Key/Value pair store". Below is my assessment of the pros and cons of each. <br>
<div style="margin-bottom: 0pt; margin-top: 10pt;"></div></p><h3><span style="background-color: transparent; color: black; font-family: 'Trebuchet MS'; font-size: 17px; font-style: normal; font-variant: normal; font-weight: bold; line-height: 1.15; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">RavenDB</span></h3><hr><h4>Pro</h4><ul><li>Excellent .NET client API with extensibility points, providing easy developer learning curve</li><li>Designed as Document Store</li><li>Provides Index/Map/Reduce</li><li>Stores natively as JSON</li><li>Good Read Performance</li><li>Fits excellently into integration testing, due to in memory db option designed for testing</li><li>Automatically generates Ids for records</li><li>Well presented web based management studio</li><li>Raven Server has a good console where you can see requests/response times and what indexes are used to resolve queries</li></ul><h4>Con</h4><ul><li>Cannot test replication, sharding or authenticated access functions without purchasing licenses, and licenses are needed for anything other than development (UAT would need licenses)</li><li>Some concerns over the dependence of the RavenDB project on one key developer</li><li>Some concerns about the robustness of the testing of the product and its unproven track record in enterprise solutions (posts <a href="http://blog.orangelightning.co.uk/2013/11/would-i-recommend-ravendb/">like this</a> are easy to find)</li><li>Yet another product for devops to support</li><li>Need to understand the “eventually consistent” model well when designing solutions</li></ul><div> <br>
</div><h3>FoundationDB</h3><hr><h4>Pro</h4><ul><li>We already use it in several other services (we have experience of it)</li><li>Better licensing terms (by far): all features are free outside production, and the production licensing terms essentially mean it is currently free for us to use</li><li>Full ACID compliance</li><li>Has both Consistency and Availability during Partitioning (assuming it isn’t a catastrophic failure)</li><li>Built-in transaction retries</li><li>Support from the FDB team is excellent</li><li>Excellent read performance; range reads are only marginally slower than single reads</li><li>Transaction isolation level is serializable</li><li>Simple to scale horizontally</li></ul><h4>Con</h4><ul><li>Designed as a Key/Value pair store rather than a Document Store</li><li>Weak .NET support (only a 3rd-party .NET client wrapper on top of the C client) with less .NET documentation/support; .NET is not considered a first-class citizen of FoundationDB</li><li>Constraints on deployment - cannot be deployed in IIS, must be self-hosted - loss of IIS-specific features such as graceful request handling and automatic app pool recycles/memory management</li><li>Cannot run in multiple AppDomains on the same process</li><li>Like Raven, also an alpha product, though we have fewer concerns about the composition of the development team</li><li>Does not generate IDs, so a separate “ID Generation Service” would be required (or switch the entire platform to GUIDs, with the resulting data migrations)</li><li>No nice ‘management’ interface - you need to roll your own admin tool</li></ul><p></p>

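<p>(As an illustration of the integration-testing point in the RavenDB pros above, this is roughly what Raven's in-memory mode looks like in a test. A sketch only, assuming the Raven.Client.Embedded package; the Profile class here is a stand-in for whatever document type you are storing:)</p>

<p><pre>// Spin up a throwaway, purely in-memory document store for a test run
using (var store = new EmbeddableDocumentStore { RunInMemory = true })
{
    store.Initialize();
    using (var session = store.OpenSession())
    {
        session.Store(new Profile { Name = "test" });
        session.SaveChanges();
        // assertions against the session/store go here - nothing touches disk
    }
}</pre></p>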
<p>To summarise the differences at a high level: RavenDB is great for .NET developers to rapidly write applications against, but may provide a problematic operational experience, and paying before you can successfully test replication/"clustering" is a tough ask. Conversely, FoundationDB has a really steep learning curve to get going with .NET, and you have to live without several normally expected comforts, but it does provide a compelling operational case from the ACID/clustering point of view (though it too is an alpha product, its testing seems far more thorough).</p>

<p>As for which is "better", well, it really depends on purpose. It looks like a case of weighing up "fast, easy development" versus "fast, easy ongoing operational support". As a business we are currently leaning towards the operational ease, as products tend to spend most of their life in production, not development (unless you are the UK government of course).</p>]]></content:encoded></item><item><title><![CDATA[It's great when your tools make you more effective...]]></title><description><![CDATA[<p><span style="background-color: white; color: #404040; font-family: Roboto, arial, sans-serif; font-size: 12.727272033691406px; line-height: 16.545454025268555px;">Today I wrote some <b>code</b>, which my <b>integration tests</b> showed a regression in. I fixed this and merged to&nbsp;<b>GitHub</b>. <b>Team City</b> automatically built and versioned a package, which <b>Octopus Deploy</b> deployed to dev01. I then ran my <b>load tests</b>, which showed things had slowed down. I checked</span></p>]]></description><link>http://www.pauldevenney.com/it_s_great_when_your_tools_make_you_more_effective---/</link><guid isPermaLink="false">b52a0de0-7a50-4c50-9a30-7dcfdc68cfc8</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Mon, 23 Jun 2014 15:43:00 GMT</pubDate><content:encoded><![CDATA[<p><span style="background-color: white; color: #404040; font-family: Roboto, arial, sans-serif; font-size: 12.727272033691406px; line-height: 16.545454025268555px;">Today I wrote some <b>code</b>, which my <b>integration tests</b> showed a regression in. I fixed this and merged to&nbsp;<b>GitHub</b>. <b>Team City</b> automatically built and versioned a package, which <b>Octopus Deploy</b> deployed to dev01. I then ran my <b>load tests</b>, which showed things had slowed down. I checked out the problem using <b>New Relic</b>, which showed traces for the slow transactions, along with <b>our correlation token</b> for the slow requests. 
I then used <b>Kibana </b>to find all log entries stored in <b>Elastic Search</b> with that correlation token, which highlighted the area causing the slow down.</span><br style="background-color: white; color: #404040; font-family: Roboto, arial, sans-serif; font-size: 12.727272033691406px; line-height: 16.545454025268555px;"><br style="background-color: white; color: #404040; font-family: Roboto, arial, sans-serif; font-size: 12.727272033691406px; line-height: 16.545454025268555px;"><span style="background-color: white; color: #404040; font-family: Roboto, arial, sans-serif; font-size: 12.727272033691406px; line-height: 16.545454025268555px;">I love it when a plan comes together....</span></p>]]></content:encoded></item><item><title><![CDATA[A simple Orchard module to inject a diagnostics shape into every page of your Orchard site]]></title><description><![CDATA[<p>Orchard has some great extensibility hooks. This post will show you how to very quickly use one to add a diagnostic section (like below) to the top of each page.</p>

<p><img src="http://www.pauldevenney.com/content/images/fromblogger/sessionchecker.PNG" alt="sessionchecker.PNG"></p>]]></description><link>http://www.pauldevenney.com/a_simple_orchard_module_to_inject_a_diagnostics_shape_into_every_page_of_your_orchard_site/</link><guid isPermaLink="false">9ae6b36c-1e1e-486f-a90c-95ca212a57ff</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Thu, 04 Jul 2013 19:00:00 GMT</pubDate><content:encoded><![CDATA[<p>Orchard has some great extensibility hooks. This post will show you how to very quickly use one to add a diagnostic section (like below) to the top of each page.</p>

<p><img src="http://www.pauldevenney.com/content/images/fromblogger/sessionchecker.PNG" alt="sessionchecker.PNG"></p>]]></content:encoded></item><item><title><![CDATA[Gotcha in Orchard CMS RoutesDescriptor when using Multi-Tenancy]]></title><description><![CDATA[<p>It's been a while since I posted, and I've been using Orchard CMS a lot among other things. I came across a rather tricky problem today; hopefully this post will help others find the resolution quicker than I did. In Orchard you can implement IRouteProvider in any module you write.</p>]]></description><link>http://www.pauldevenney.com/gotcha_in_orchard_cms_routesdescriptor_when_using_multi_tenancy/</link><guid isPermaLink="false">07c20fc7-c25f-47b7-a469-a4b658452d30</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Thu, 06 Jun 2013 17:29:00 GMT</pubDate><content:encoded><![CDATA[<p>It's been a while since I posted, and I've been using Orchard CMS a lot among other things. I came across a rather tricky problem today; hopefully this post will help others find the resolution quicker than I did. In Orchard you can implement IRouteProvider in any module you write. This is basically a wrapper for MVC routes, and works really well. For example, the route defined below is for custom handling in the event of errors.</p>

<hr>

<p><pre><span style="color: blue;">public</span> <span style="color: blue;">class</span> ErrorHandlingRoutesProvider : IRouteProvider
{
    <span style="color: blue;">public</span> IEnumerable&lt;RouteDescriptor&gt; GetRoutes()
    {
        <span style="color: blue;">return</span> <span style="color: blue;">new</span>[]{
            <span style="color: blue;">new</span> RouteDescriptor{
                Name = <span style="color: maroon;">"ErrorRoute"</span>,
                Priority = <span style="color: maroon;">1</span>,
                Route = <span style="color: blue;">new</span> Route(
                    <span style="color: maroon;">"Error"</span>,
                    <span style="color: blue;">new</span> RouteValueDictionary{
                        {<span style="color: maroon;">"action"</span>, <span style="color: maroon;">"ErrorPage"</span>},
                        {<span style="color: maroon;">"controller"</span>, <span style="color: maroon;">"ErrorHandler"</span>},
                        {<span style="color: maroon;">"area"</span>, <span style="color: maroon;">"BG.Shared.ErrorHandling"</span>}
                    },
                    <span style="color: blue;">new</span> RouteValueDictionary(), <span style="color: green;">//constraints (none here)</span>
                    <span style="color: blue;">new</span> RouteValueDictionary{
                        {<span style="color: maroon;">"area"</span>, <span style="color: maroon;">"BG.Shared.ErrorHandling"</span>}
                    },
                    <span style="color: blue;">new</span> MvcRouteHandler())
            }
        };
    }
}</pre><hr>This works as intended in a single site. <b>However</b>, when you have <i>two</i> tenants using the <i>same module</i>, this will fail with an error similar to the following:</p>

<hr>

<p><pre>System.ArgumentException: A route named 'ErrorRoute' is already in the route collection. Route names must be unique.
Parameter name: name
   at System.Web.Routing.RouteCollection.Add(String name, RouteBase item)
   at Orchard.Mvc.Routes.RoutePublisher.Publish(IEnumerable`1 routes) in d:\Workspaces\GitHub\src\Orchard\Mvc\Routes\RoutePublisher.cs:line 100
   at Orchard.Environment.DefaultOrchardShell.Activate() in d:\Workspaces\GitHub\src\Orchard\Environment\DefaultOrchardShell.cs:line 48
   at Orchard.Environment.DefaultOrchardHost.ActivateShell(ShellContext context) in d:\Workspaces\GitHub\src\Orchard\Environment\DefaultOrchardHost.cs:line 156
   at Orchard.Environment.DefaultOrchardHost.CreateAndActivateShells() in d:\Workspaces\GitHub\src\Orchard\Environment\DefaultOrchardHost.cs:line 135</pre><hr>As the stack trace shows, each tenant publishes its routes into the single shared route collection, where route names must be unique. The workaround is to comment out the "Name" property of the RouteDescriptor, as follows:</p>

<p><pre><span style="color: blue;">public</span> IEnumerable&lt;RouteDescriptor&gt; GetRoutes()
{
    <span style="color: blue;">return</span> <span style="color: blue;">new</span>[]{
        <span style="color: blue;">new</span> RouteDescriptor{
            <span style="color: green;">//Name = "ErrorRoute", //This doesn't work in Multi-Tenancy</span>
            Priority = <span style="color: maroon;">1</span>,
            Route = <span style="color: blue;">new</span> Route(
                <span style="color: maroon;">"Error"</span>,
                <span style="color: blue;">new</span> RouteValueDictionary{
                    {<span style="color: maroon;">"action"</span>, <span style="color: maroon;">"ErrorPage"</span>},
                    {<span style="color: maroon;">"controller"</span>, <span style="color: maroon;">"ErrorHandler"</span>},
                    {<span style="color: maroon;">"area"</span>, <span style="color: maroon;">"BG.Shared.ErrorHandling"</span>}
                },
                <span style="color: blue;">new</span> RouteValueDictionary(), <span style="color: green;">//constraints (none here)</span>
                <span style="color: blue;">new</span> RouteValueDictionary{
                    {<span style="color: maroon;">"area"</span>, <span style="color: maroon;">"BG.Shared.ErrorHandling"</span>}
                },
                <span style="color: blue;">new</span> MvcRouteHandler())
        }
    };
}</pre></p>

<p>I've started investigating this with the Orchard team, but the workaround doesn't really appear to have any drawbacks.</p>]]></content:encoded></item><item><title><![CDATA[Say NO to SOPA and PIPA]]></title><description><![CDATA[<p><a href="http://i.imgur.com/x1buX.png">Explanation of SOPA in cartoon</a> | <a href="http://en.wikipedia.org/wiki/Stop_Online_Piracy_Act">Wikipedia entry for SOPA</a></p>]]></description><link>http://www.pauldevenney.com/say_no_to_sopa_and_pipa/</link><guid isPermaLink="false">674d0e18-aff0-4b21-a181-58ee4dbffb8e</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Wed, 18 Jan 2012 19:56:00 GMT</pubDate><content:encoded><![CDATA[<p><a href="http://i.imgur.com/x1buX.png">Explanation of SOPA in cartoon</a> | <a href="http://en.wikipedia.org/wiki/Stop_Online_Piracy_Act">Wikipedia entry for SOPA</a></p>]]></content:encoded></item><item><title><![CDATA[Unreadable content was found in this item - PerformancePoint 2007 to 2010 Migration]]></title><description><![CDATA[<p>This little chestnut caused me no end of fun, and there is not a whole lot out there about it.</p>

<p></p><h2>The Problem:</h2>When you run the Import PerformancePoint 2007 Content wizard, using a valid account to connect to SQL server, and a valid BI Center as the target locations (which]]></description><link>http://www.pauldevenney.com/unreadable_content_was_found_in_this_item___performancepoint_2007_to_2010_migration/</link><guid isPermaLink="false">d6356e62-4e8a-41b8-97b1-b1b351e68d7e</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Tue, 19 Jul 2011 11:22:00 GMT</pubDate><content:encoded><![CDATA[<p>This little chestnut caused me no end of fun, and there is not a whole lot out there about it.</p>

<h2>The Problem:</h2><p>When you run the Import PerformancePoint 2007 Content wizard, using a valid account to connect to SQL Server, and a valid BI Center as the target location (which the wizard very kindly automatically identifies and selects for you), you still receive the following message in each section.</p>

<p>"Unreadable content was found in this item".</p>

<p>There are two possibilities for this problem:</p>

<p><b>When it happens to all sections of the import (data sources, indicators, KPIs, Report views, score cards, dashboards):</b></p>

<p>The likely reason here is that the server needs to communicate with itself using a web service URL, and the (cursed) loopback adapter check is on. To solve this, the server must be able to access the target web application (e.g. <a href="http://myintranet.company.com">http://myintranet.company.com</a>) from the local machine. This is easy to test: open a browser on the server itself and see if you can reach the site. The following steps will resolve this issue:</p>

<ol><li><p>Remove the loopback adapter check using the following PowerShell:

<blockquote># Disable the Loopback Check
# This setting usually kicks out a 401 error when you try to navigate to sites that resolve to a loopback address e.g. 127.0.0.1
New-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa -Name "DisableLoopbackCheck" -value "1" -PropertyType dword</blockquote></p></li><li>
<p>Ensure that a host entry exists for the site, either via DNS (for systems already in production) or, if you are testing or don't have access to DNS, by adding an entry to your hosts file at c:\windows\system32\drivers\etc\hosts, e.g.

127.0.0.1 myintranet.company.com</p></li></ol>
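<p>(As an aside: if disabling the loopback check wholesale feels too heavy-handed - it is, after all, a security measure - the alternative Microsoft documents in KB896861 is to whitelist only the host names you need via the BackConnectionHostNames registry value. A sketch in PowerShell, with an example host name you would substitute for your own:)</p>

<blockquote># Whitelist specific host names rather than disabling the loopback check entirely
New-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa\MSV1_0 -Name "BackConnectionHostNames" -PropertyType MultiString -Value "myintranet.company.com"</blockquote>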

<p><b>References:</b> <a href="http://social.technet.microsoft.com/Forums/en/sharepoint2010setup/thread/306c59e0-c74c-4f37-9df3-2b1202cef54e">http://social.technet.microsoft.com/Forums/en/sharepoint2010setup/thread/306c59e0-c74c-4f37-9df3-2b1202cef54e</a>, <a href="http://sptwentyten.wordpress.com/2010/03/06/disable-the-loopback-check-via-powershell/">http://sptwentyten.wordpress.com/2010/03/06/disable-the-loopback-check-via-powershell/</a></p>

<p><b>When it only happens to data sources and scorecards</b>  </p>

<p>My problem persisted even after applying the fix above. On further investigation of the SharePoint ULS logs, you should see messages similar to:

<blockquote>Failed to look up string with key "Section2TitleResource", keyfile osrvcore. edb1db92-2bd9-4dab-b772-b3b36b293e99</blockquote>  
<blockquote>Unreadable content was found in this item.  System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.     at System.ThrowHelper.ThrowKeyNotFoundException()     at System.Collections.Generic.Dictionary`2.get_Item(TKey key)     at Microsoft.SharePoint.Administration.SPScenarioContext.RetrieveDataFromSessionState(String key)     at Microsoft.PerformancePoint.ImportUtility.UI.WebPages.ScenarioPageBase.Page_Init(Object sender, EventArgs e) edb1db92-2bd9-4dab-b772-b3b36b293e99</blockquote></p><p>Now, this might lead you to think that something did not install correctly, as it is complaining about resource files, but like nearly all problems in SharePoint, it comes back to permissions and user context. The key issue in my case? I had created my own web service application pool to run the PerformancePoint service application under. The managed account for the application pool was a low-privilege account (not the farm account, for example).</p>

<p>The key thing I had missed was that I needed to run the following PowerShell to grant my service account the object access required to run the wizard successfully.
<blockquote>$w = Get-SPWebApplication("<a href="http://myintranet.company.com">http://myintranet.company.com</a>")  
$w.GrantAccessToProcessIdentity("dev\svc_PPServices")</blockquote></p>

<p>To be fair, Microsoft do list this on TechNet, but as with so many MS articles, they fail to tell you <i>why</i>, or how to recognise when you have missed this step.</p>

<p>References: <a href="http://technet.microsoft.com/en-us/library/ee748643.aspx">http://technet.microsoft.com/en-us/library/ee748643.aspx</a></p>]]></content:encoded></item><item><title><![CDATA[SharePoint 2010 Capacity Guidelines updated for SP1]]></title><description><![CDATA[<p>Major changes seem to be more qualification on IOPS for large content databases, plus some more detailed understanding of the real limits of DB sizes in particular scenarios. Of particular interest is the idea that individual DBs could be up to 4TB in size (though why you would plan for</p>]]></description><link>http://www.pauldevenney.com/sharepoint_2010_capacity_guidelines_updated_for_sp1/</link><guid isPermaLink="false">4b46c0c6-279b-4774-b39d-43b0af87edc0</guid><dc:creator><![CDATA[Paul Devenney]]></dc:creator><pubDate>Mon, 11 Jul 2011 16:12:00 GMT</pubDate><content:encoded><![CDATA[<p>Major changes seem to be more qualification on IOPS for large content databases, plus some more detailed understanding of the real limits of DB sizes in particular scenarios. Of particular interest is the idea that individual DBs could be up to 4TB in size (though why you would plan for one 4TB database rather than a number of more manageable ones is a different question).</p>

<p><a href="http://sharepoint.microsoft.com/blog/Pages/BlogPost.aspx?pID=988">http://sharepoint.microsoft.com/blog/Pages/BlogPost.aspx?pID=988</a></p>]]></content:encoded></item></channel></rss>