Posts

Showing posts from 2013

Super fast Contains search on a list

If you need to do many Contains lookups over a large collection, use HashSet&lt;T&gt; instead of List&lt;T&gt; - it is roughly 100X faster!
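A quick illustration of the difference (my own sketch, not from the original post):

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class ContainsBenchmark
{
    static void Main()
    {
        List<int> list = Enumerable.Range(0, 1000000).ToList();
        HashSet<int> set = new HashSet<int>(list);

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < 10000; i++)
        {
            list.Contains(i * 100);   // O(n) linear scan every time
        }
        Console.WriteLine("List<T>.Contains:    " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int i = 0; i < 10000; i++)
        {
            set.Contains(i * 100);    // O(1) hash lookup every time
        }
        Console.WriteLine("HashSet<T>.Contains: " + sw.ElapsedMilliseconds + " ms");
    }
}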

Entity Framework Gotcha

If you need to migrate a lot of data, which means running updates against existing rows, it is 100X faster to change the objects in memory first and then make a single SaveChanges call, instead of saving row by row. Also, when batching such operations (say, 5,000 rows at a time), re-initialize the context object for each batch; otherwise it caches everything and gets progressively slower over time.
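A rough sketch of that batching pattern; MyDbContext, MyEntity, and Transform are placeholders of my own, not names from the post:

// Requires: using System.Linq; plus an EF model exposing context.MyEntities.
public static void MigrateInBatches()
{
    const int batchSize = 5000;
    int processed = 0;

    while (true)
    {
        // New context per batch so the change tracker does not grow without bound.
        using (var context = new MyDbContext())
        {
            var batch = context.MyEntities
                               .OrderBy(e => e.Id)
                               .Skip(processed)
                               .Take(batchSize)
                               .ToList();
            if (batch.Count == 0)
                break;

            foreach (var entity in batch)
            {
                entity.SomeColumn = Transform(entity.SomeColumn);   // change the objects in memory
            }

            context.SaveChanges();   // one SaveChanges per batch instead of per row
            processed += batch.Count;
        }
    }
}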

Fastest Serializer for .NET and C#

I just ran a benchmark today. DataContractSerializer runs pretty fast, but DataContractJsonSerializer can serialize faster. Note, however, that DataContractSerializer deserializes faster than the JSON serializer. Deserialization times: Binary: 198 ms, DataContract: 20 ms, Json: 74 ms. Serialization times: Binary: 191 ms, DataContract: 0.3 ms, Json: 0.2 ms. As the size of the data increases, Binary slows down faster than the other techniques. Also, the serialized output is significantly smaller for Json and DataContract.
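For reference, a minimal harness along these lines could look like the sketch below (my own code, not the author's benchmark; the Item type and data sizes are made up):

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
using System.Runtime.Serialization.Json;

[Serializable]
[DataContract]
public class Item
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

class SerializerBenchmark
{
    static void Main()
    {
        List<Item> data = Enumerable.Range(0, 100000)
                                    .Select(i => new Item { Id = i, Name = "Item" + i })
                                    .ToList();

        Time("DataContract", s => new DataContractSerializer(typeof(List<Item>)).WriteObject(s, data));
        Time("Json        ", s => new DataContractJsonSerializer(typeof(List<Item>)).WriteObject(s, data));
        Time("Binary      ", s => new BinaryFormatter().Serialize(s, data));
    }

    static void Time(string name, Action<Stream> serialize)
    {
        using (var ms = new MemoryStream())
        {
            var sw = Stopwatch.StartNew();
            serialize(ms);
            Console.WriteLine("{0}: {1} ms, {2} bytes", name, sw.ElapsedMilliseconds, ms.Length);
        }
    }
}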

Another way to do a not exists SQL query with best performance

This comes up again and again as a thorny and persistent problem. I got this new way of doing it from my friend:

SELECT alias1.PKCol1, alias1.Col2
FROM table1 alias1
WHERE alias1.Col3 = 11
  AND alias1.Col4 = 43
  AND NOT EXISTS (SELECT 'x' FROM table2 alias2 WHERE alias2.PKCol1 = alias1.PKCol1)

Azure App failing after upgrade to SDK 2.1?

There are many posts online about this problem with no answers. No errors are seen anywhere - it may give RoleEnvironment errors or host-not-found errors. To fix it, go through all your projects and make sure the DLLs reference the correct Azure version. In my case, Microsoft.WindowsAzure.ServiceRuntime.dll was still referencing v2.0 and that was causing the problem. Change it to v2.1 for ALL projects.

Caching is not as easy as you think it is

For the past year I have been playing around with Azure caching and learning a lot. I got frustrated and confused, but finally there seems to be light at the end of the tunnel. This is what I've learnt from my experience so far: 1. I have no idea why they allow co-located caching, because if you have a decent-sized cache with lots of reads and writes, it is unusable. Use a dedicated cache role - only that can handle significant load. 2. Caching that uses BLOB storage as its persistence mechanism has problems if it hits BLOB storage frequently, because items do not remain in the cache. I was flummoxed by this problem but worked around it: in my case I could get away with checking for the existence of the BLOB, which is much faster than trying to download it every time (my data only requires the existence check). 3. Don't be unhappy with your code if you have x-small and small instances and your sites with caching roles do not perform well. ...

C# async method without a real await-supported method called within it.

I had been searching for this for a long time. Finally found it: http://stackoverflow.com/questions/15522900/how-to-safely-call-an-async-method-in-c-sharp-without-await

public async static Task<int> GetTwitterResultsCountAsync(string itemLink)
{
    int returnCount = await Task.Run<int>(() =>
    {
        int count = 0;
        try
        {
        ...
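A self-contained sketch of the same pattern, with the body reduced to a placeholder since the original excerpt is cut off:

using System;
using System.Threading.Tasks;

public static class AsyncWrapper
{
    // Wraps synchronous work in Task.Run so the method is awaitable even though
    // nothing inside it natively supports await. The body is a placeholder.
    public static async Task<int> GetResultsCountAsync(string itemLink)
    {
        int returnCount = await Task.Run<int>(() =>
        {
            int count = 0;
            try
            {
                // ... the original synchronous work (e.g. an HTTP call and parsing) goes here ...
                count = itemLink.Length;   // placeholder so the sketch compiles
            }
            catch (Exception)
            {
                count = 0;
            }
            return count;
        });
        return returnCount;
    }
}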

There is more to the late 2008 macbook RAM upgrade than meets the eye

http://blog.macsales.com/9102-secret-firmware-lets-late-08-macbooks-use-8gb I read this and decided to upgrade the RAM on my MacBook - late 2008 model, MacBook 5,1. I first searched for the correct RAM online and saw this: http://eshop.macsales.com/shop/memory/Apple_MacBook_MacBook_Pro/Upgrade/DDR3 This is the memory it recommends: 8.0GB OWC Memory Upgrade, 2 x 4.0GB PC8500 DDR3 1066MHz 204-pin. Now, this costs $89.88. I hate buying stuff online and want to see it for myself, so I went to the local Best Buy, which had a $55 promo offer on an 8GB RAM kit with the following specs: PC3-10666 DDR3 RAM with both 1066 and 1333MHz support. I was not sure whether this was compatible, so I went to the Geek Squad and double-checked that it was OK. I opened my MacBook and upgraded the RAM, which was easy, and saw a huge amount of dust inside which I thoroughly cleaned... Then I booted up, and my MacBook now runs as fast as a new MacBook! I mean, this is 2X the speed it w...

Controlling costs in Windows Azure

One of the problems in dealing with any cloud platform is that we need to minimize costs when nobody is using the service, while still scaling as much as needed when more users come in. When you have a lot of money it is easy to simply throw money at the problem, so that even when nobody is using the service it runs, say, 2 small or 2 medium roles. Sometimes we try to justify this by saying the application really needs 2 small roles just for something like caching. It all adds up quickly when the main app needs another 2 roles and a service is another 2 roles. I have gone from using 5 roles to just 3 roles at steady state, and I get better performance too. I am currently using 2 co-located roles which run caching + web. This makes the web role faster because the cache is in its own memory. This is faster than using dedicated roles, and even than adding a local cache alongside them. The slowest is shared caching, where they provide the service. This is followed by dedicate...

You should always know how to debug process crashes using dump files

http://blogs.msdn.com/b/debugger/archive/2009/12/30/what-is-a-dump-and-how-do-i-create-one.aspx http://blogs.msdn.com/b/visualstudioalm/archive/2013/06/20/using-visual-studio-2013-to-diagnose-net-memory-issues-in-production.aspx

VS 2012 (with Update 3) screws up .cmd files containing batch scripts

This may not be specific to Update 3 or VS 2012. I noticed that if you edit a .cmd file containing batch commands in Visual Studio, it adds garbage to the first line of the file (most likely a Unicode byte-order mark from saving the file as UTF-8). The script then fails to execute because of this. Use Notepad to edit such files instead.

Azure role keeps on crashing and recycling after upgrade to 2.0 sdk

I spent two days debugging and trying to fix this file-load exception. It finally turned out that, of the 50 projects in the solution, several were still pointing to the old 1.7 and 1.8 DLLs. I fixed that and now everything works fine.

Horrible LINQ to Twitter - finally found out how to UpdateStatus

_authorizer = new SingleUserAuthorizer
{
    Credentials = new InMemoryCredentials
    {
        ConsumerKey = TwitterConfiguration.ConsumerKey,
        ConsumerSecret = TwitterConfiguration.ConsumerSecret,
        ...
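The excerpt is cut off, so here is a hedged sketch of the full flow. It assumes the LINQ to Twitter 2.x API that the code above appears to use, and the TwitterConfiguration token properties are placeholders of mine (in that library version, InMemoryCredentials.OAuthToken holds the access token and AccessToken holds the access token secret):

var authorizer = new SingleUserAuthorizer
{
    Credentials = new InMemoryCredentials
    {
        ConsumerKey = TwitterConfiguration.ConsumerKey,
        ConsumerSecret = TwitterConfiguration.ConsumerSecret,
        OAuthToken = TwitterConfiguration.AccessToken,          // access token (placeholder property)
        AccessToken = TwitterConfiguration.AccessTokenSecret    // access token secret (placeholder property)
    }
};

var twitterCtx = new TwitterContext(authorizer);
twitterCtx.UpdateStatus("Posting a status update via LINQ to Twitter");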

Process - Thread - Task Nightmare

I found out one more thing today: a process, a task, and a thread can all have different priorities, independent of each other. For a task, you cannot legitimately change its priority without causing issues, and it is not recommended. For a thread, you may want to reduce its priority to prevent it from holding 100% CPU while doing some work. However, the caveat with using threads is that a raw thread runs at a lower security context than a thread-pool thread (a.k.a. a task) - hence, either change its security context internally OR use a task instead in such cases. Yes, even if the process itself runs as a user which has access (to, say, IIS), a thread spun up from it may not be able to read IIS - we have an issue where the first time the service starts up it does not have access, although the second time, after we restart it, it does. Weird.
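A small sketch of the thread-priority point above (my own example):

using System;
using System.Threading;

class PriorityExample
{
    static void Main()
    {
        // A dedicated thread whose priority we lower so a tight loop does not starve
        // other processes of CPU. TPL tasks run on pool threads whose priority
        // should not be changed.
        var worker = new Thread(() =>
        {
            while (true)
            {
                // CPU-bound work would go here.
            }
        });
        worker.IsBackground = true;
        worker.Priority = ThreadPriority.BelowNormal;   // yield CPU to normal-priority work
        worker.Start();

        Console.ReadLine();
    }
}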

Divide and Round to Next

public static int DivideAndRound(int x, int y)
{
    return (int)Math.Ceiling((double)x / (double)y);
}

Everything you thought you knew about .NET Threading is wrong.

A few years ago, everybody used to shout about XML and use only XML for everything. It was a good case of people rushing to something because of the hype and using it for scenarios it was not meant for. I have to say I fell for the TPL library hard. It is actually a piece of dirt which has been over-hyped and sold as a solution to all threading issues. Let me make it clear here: the TPL fails miserably because, as per the recommendations, we can't set the thread priority without screwing up its internal scheduler; and because we can't do that, if the task contains a for or while loop which spins fast enough, the processor will sit at 100% CPU. That by itself is not a problem. The issue is that because the task runs at normal priority, no other process can get access to the processor easily and things slow down considerably. Hence, I have tested and verified that if you know that you are going to use a limited number of threads, and you want to split up work which might ...

A class for parallelizing any For Loop in C#

From:

foreach (Entity item in collection)
{
    DoWork(item);
}

To:

LoopParallelizer.Execute<Entity>(TaskCreationOptions.PreferFairness,
                                 collection,
                                 new Action<Entity>(DoWork));

This will basically split any for loop into a configurable number of threads, run those threads in parallel, and wait till all of them are over. The bigger the collection, the faster will be the performance gain by doing t...
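The post does not show the implementation here, so below is a minimal sketch of what such a LoopParallelizer could look like. It is not the author's original code, and the degreeOfParallelism parameter is my own addition:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class LoopParallelizer
{
    public static void Execute<T>(TaskCreationOptions options,
                                  IEnumerable<T> collection,
                                  Action<T> doWork,
                                  int degreeOfParallelism = 4)
    {
        List<T> items = collection.ToList();
        var tasks = new List<Task>();

        // Split the collection into one chunk per thread and process the chunks in parallel.
        int chunkSize = (int)Math.Ceiling((double)items.Count / degreeOfParallelism);
        for (int i = 0; i < items.Count; i += chunkSize)
        {
            List<T> chunk = items.Skip(i).Take(chunkSize).ToList();   // local copy per iteration
            tasks.Add(Task.Factory.StartNew(() =>
            {
                foreach (T item in chunk)
                {
                    doWork(item);
                }
            }, CancellationToken.None, options, TaskScheduler.Default));
        }

        Task.WaitAll(tasks.ToArray());   // block until every chunk has finished
    }
}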

How to get a usable hash code of an empty file for comparison purposes

This puzzled me for a while, but I have a solution now. Just create a hash from the file name, because the file contents are empty. In C#, you can use the below:

public static byte[] CreateMD5Hash(List<string> dataList)
{
    using (MemoryStream fingerStream = new MemoryStream())
    {
        using (StreamWriter writer = new StreamWriter(fingerStream))
        {
            foreach (string data in dataLi...
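Since the excerpt is cut off, here is a completed sketch of the same idea (write the strings, e.g. the file names, into a stream and hash the stream contents):

using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

public static class HashHelper
{
    public static byte[] CreateMD5Hash(List<string> dataList)
    {
        using (MemoryStream fingerStream = new MemoryStream())
        {
            using (StreamWriter writer = new StreamWriter(fingerStream))
            {
                foreach (string data in dataList)
                {
                    writer.Write(data);
                }
                writer.Flush();

                fingerStream.Position = 0;   // rewind before hashing
                using (MD5 md5 = MD5.Create())
                {
                    return md5.ComputeHash(fingerStream);
                }
            }
        }
    }
}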

Sending emails with attachments in .NET (Caveats)

This is quite easy to do. What is not so apparent, though, is that if you try to delete the attachment, either in code or manually, even after the mail is sent, it does not let you do it. This puzzled me for a while. Now I realize: you need to dispose the MailMessage - if you don't, it hangs on to the file until your program restarts.
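A minimal sketch of the fix (server and address values are placeholders):

using System.Net.Mail;

public static class Mailer
{
    public static void SendWithAttachment(string attachmentPath)
    {
        using (var message = new MailMessage("from@example.com", "to@example.com",
                                             "Subject", "Body"))
        {
            message.Attachments.Add(new Attachment(attachmentPath));

            using (var client = new SmtpClient("smtp.example.com"))
            {
                client.Send(message);
            }
        }   // MailMessage (and its attachments) disposed here - the file is no longer locked

        System.IO.File.Delete(attachmentPath);   // now succeeds
    }
}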

System.Threading.Timer (also) has issues

As many .NET developers know, we can't rely on System.Timers.Timer because it sometimes does not fire its events on time. Hence, we are recommended to use System.Threading.Timer. Yesterday I found one issue with this timer as well which I wanted to point out: http://msdn.microsoft.com/en-us/library/yz1c7148(v=vs.100).aspx If you pass Infinite as the due time, then on some servers the timer just stops working and never fires again. I usually test with all Windows updates applied (optional + important); on such a system this never happened. However, we did notice it happening on a test machine yesterday, a 2008 R2 VM. To get it working, I passed the same value as the period to the due time. So far, that seemed to be working. [Update] It does not work. I now stop the timer, reset the interval, and start it again. Hopefully this works.
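One way to implement the stop/reset/start workaround from the update, assuming System.Threading.Timer; the interval and names are mine:

using System;
using System.Threading;

class TimerRestartExample
{
    private static Timer _timer;
    private static readonly TimeSpan _interval = TimeSpan.FromSeconds(30);

    static void Main()
    {
        _timer = new Timer(OnTick, null, _interval, _interval);
        Console.ReadLine();
    }

    private static void OnTick(object state)
    {
        // Stop the timer while the work runs...
        _timer.Change(Timeout.Infinite, Timeout.Infinite);
        try
        {
            // ... do the periodic work here ...
        }
        finally
        {
            // ...then explicitly reset the due time and period to start it again.
            _timer.Change(_interval, _interval);
        }
    }
}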

Visual Studio just killed my code changes (just after I tested it works perfectly) - how to get your code back.

Just decompile the .NET dll you were testing with if you want to be 100% sure that you did not lose your changes.

How to reduce CPU usage when the code causes continuous 80-100% CPU when running normally.

In this example, I was trying to create hash codes for 14K files. The first thing to do is to try to use an algorithm which does not use that much CPU. In my case, that meant using a CRC32 algorithm instead of the MS MD5 class. But this only reduced the CPU from 100% to 80% - it still stayed at 80% for half an hour or more. In such cases, we have to work around it, because what is basically happening is that generating hash codes for that many files, one by one, simply consumes that much CPU. The solution is to put thread sleeps in between, and if you want, like me, you can vary the sleep according to the size of the file, because the high CPU is caused not by the number of files alone but by their size. Put an adequate amount of sleep between the hash-code generation calls so that the CPU is able to do other work in between. You can experiment by changing the sleep and observing how the CPU goes up and down. I heard that using BeginInvoke instead of Task ba...
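A sketch of the throttling idea (the file paths, sleep formula, and the MD5 choice are illustrative only):

using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading;

public static class ThrottledHasher
{
    public static void HashFiles(string[] filePaths)
    {
        foreach (string path in filePaths)
        {
            long size;
            using (var md5 = MD5.Create())
            using (var stream = File.OpenRead(path))
            {
                size = stream.Length;
                byte[] hash = md5.ComputeHash(stream);
                // ... store or compare the hash here ...
            }

            // Larger files consume more CPU, so back off longer after them.
            int sleepMs = (int)Math.Min(2000, 50 + size / 10000);
            Thread.Sleep(sleepMs);
        }
    }
}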

Calling a higher level method deep inside the lower levels of the code

I have encountered this problem many times in my career. Somewhere deep in the code, where we have very little context, we need to use some parameters which are only available in the top layers of the code - and if you are really unlucky, you may even need to make a WCF call to another server. It is not very difficult to solve this problem. This is a scenario where you can "use a method like a variable", i.e. create a delegate which may take a parameter and may even return a value (if you need one). Inside this method you can write all the high-level code you want.

Func<string, string> MyAction = delegate(string var1)
{
    // Whatever you want to do, you can do here.
    return "something";
};

- If you don't want to return anything, use Action<T> instead.
- If you are inside a for loop when you declare this delegate, do not use loop-indexed variables like list[i] inside the method. Assign it to a real variable first: var x = list[i] ...
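A small end-to-end illustration of handing such a delegate down to a lower layer (my own example, not from the post):

using System;

public class TopLayer
{
    public void Process()
    {
        string contextOnlyKnownHere = "tenant-123";

        // Wrap the high-level work (which could even be a WCF call) in a delegate...
        Func<string, string> lookup = delegate(string key)
        {
            // ... use contextOnlyKnownHere and key to call the other service ...
            return contextOnlyKnownHere + ":" + key;
        };

        // ...and hand it down; the lower layer never needs to know the context.
        new LowerLayer().DoDeepWork(lookup);
    }
}

public class LowerLayer
{
    public void DoDeepWork(Func<string, string> lookup)
    {
        string value = lookup("some-key");   // invoked deep in the call stack
        Console.WriteLine(value);
    }
}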

Never use defaults for any program

We had an MSBuild problem, which we fixed by specifying the parameters explicitly:

msbuild "MySolution.sln" /t:Rebuild /property:WarningLevel=0 /p:Configuration=Release /p:AllowUnsafeBlocks=true /p:Platform="Any CPU"

Fixing a problem in a dll for which you do not have source code

When you need to fix a business problem and there is no other way out, you may have to resort to this method. I have a few links here which will be of use in that case, because I had to do this recently: http://social.msdn.microsoft.com/Forums/en-US/msbuild/thread/05b3cf5d-ead3-4274-88f5-6e8cbda8e8d8/ An intriguing thing I found out is that what looks like valid C# gets converted into IL and gets decompiled - but when you get source code this way, it may not compile properly, because apparently the C# compiler does not like many things in this decompiled code, which were probably optimized away in the IL. So you need to look at each case and work around it. You may have to use keywords like unchecked, new, etc. In some cases, you may have to explicitly cast objects (look at how the code appears in the decompiler and then figure out that it needs to be cast to class X or Y). If the dll you are dealing with is a wrapper around C++ dlls which are called from within it, the following links wil...

Handling Time Zone differences when sending dates to javascript and reading back over AJAX

First convert to strings:

string dateValue = lastRow.PublishDate.Value.ToLongDateString();
string timeValue = lastRow.PublishDate.Value.ToLongTimeString();

Pass to javascript like this:

String.Format("javascript:fnDoWork(new Date('{0} {1}'));",
              dateValue,
              ...

Caching gotcha in azure

If you access a web role from different URLs, do not expect the role to be able to read the cache OR the session if it was set by a call made through a different URL/server name, even if it points to the same thing. Weird. For session I can understand it - but I couldn't figure out why it did not work for memory in a dedicated cache role. [Update] I was wrong: it was not working with the cache because there was a condition checking for the session before setting the cache value, so the cache value never got set. Makes sense now.

Skip running startup tasks on Azure Emulator

http://blog.smarx.com/posts/skipping-windows-azure-startup-tasks-when-running-in-the-emulator

How to work with MemoryStream and Azure Blob Storage

http://chopapp.com/#bvgehltw The thing to remember here is that before you read the memory stream to deserialize the object, move to position 0 (front of stream). The serializer code is here: http://chopapp.com/#g8nxwk6t
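The gist of the position-0 point, shown as a simple round-trip sketch (my own example, not the code behind the links):

using System.IO;
using System.Runtime.Serialization;

public static class StreamRoundTrip
{
    // After writing to a MemoryStream its Position sits at the end, so rewind it
    // before handing it to a deserializer (or to a blob upload call).
    public static T Clone<T>(T value)
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, value);

            stream.Position = 0;                      // back to the front of the stream
            return (T)serializer.ReadObject(stream);
        }
    }
}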

Configuring SMTP Server in Azure Role

http://community.adxstudio.com/products/adxstudio-portals/developers-guide/azure/dynamically-install-an-smtp-server-in-windows-azur/ The article is great; you just don't need to add an endpoint on port 25 unless you want to make calls on that port to send emails. It works fine without it if the app living on the instance is making the SMTP calls to send emails.

Gated check-ins in TFS are great!

I think any team with distributed team members and/or a lot of people needs gated check-ins, otherwise the build keeps breaking all the time. There is a problem I found which appeared after TFS Update 2: if you do a gated check-in for the same file several times a day, even though you are the only person working on the file and you are working from a single machine, it finds conflicts during a subsequent check-in and forces you to merge changes. Looks like a bug to me.

Faster way to check for blob existence

http://chopapp.com/#y5frcmsb The above will not work for private containers, hence I have updated the code: http://chopapp.com/#fkgs5j3u
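The linked code is external, but the usual pattern with the 1.x StorageClient library looks roughly like this sketch (not the linked snippet). FetchAttributes is an authenticated request, so it also works on private containers; newer storage SDKs expose an Exists() method instead:

using Microsoft.WindowsAzure.StorageClient;

public static class BlobHelper
{
    public static bool BlobExists(CloudBlob blob)
    {
        try
        {
            blob.FetchAttributes();   // lightweight metadata request instead of a full download
            return true;
        }
        catch (StorageClientException ex)
        {
            if (ex.ErrorCode == StorageErrorCode.ResourceNotFound)
            {
                return false;
            }
            throw;
        }
    }
}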

Trying to update Azure deployment from the portal?

It really behaves weirdly. First it does not work and gives strange error messages, and IE 10 plus the new Azure portal lock the package, preventing Visual Studio from overwriting it: Error 98: Unable to remove directory "bin\Debug\app.publish\". The directory is not empty. C:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets 4220 5 XYZ. It took me some time to figure out it was the browser locking the file. It kept telling me some setting was different in the upgrade versus what was already deployed. Anyway, I kept trying again and again, and Visual Studio finally deleted the deployment and rebuilt it. I closed IE 10 and opened it again. I also figured out that when I deploy from VS 2012, profiling is turned on, so I turned it off. Now it updates whatever role I want to update without complaining. You may need to do the same - for example, updating all roles except the dedicated caching roles.

Creating not settable properties for API use

This is an easy way to do it, without using non-default constructors and such:

[Serializable]
[DataContract]
public class MyClass
{
    [DataMember]
    public int? MyProperty1 { get; internal set; }

    [DataMember]
    public int? MyProperty2 { get; internal set; }
}

In previous articles, I have mentioned why you should always mark DataContract and DataMember attributes even though it is not explicitly required anymore. For your assembly where you set the pro...

ASP.NET page weirdly redirecting to home page or not loading at all for no reason

If you have read my previous post on pointing your custom domain name to an Azure domain via a CNAME record, and how to fix the MX records for email, we modified the @ entry for the custom domain from:

@       xyz.cloudapp.net                CNAME

to

@       http://www.customdomain.com     URL Redirect (301)
www     xyz.cloudapp.net                CNAME

There is a problem with this. For some reason, if my links are http://customdomain.com/folder/1.aspx?id=123 they do not work - neither over SSL nor over HTTP. To get them to work, I had to change the link to http://www.customdomain.com/folder/1.aspx?id=123. I am not really sure why this is necessary, or why neither Google Chrome nor IE 10 is able to handle it. I'll keep ...

Getting an Azure Web Role to work with SSL, and then with ACS with real SSL certificate

It took me a week and a hosting-provider change to get my Azure stuff working the way I wanted it to work. I am going to try to mention everything here so it is easy for the next person. Just note: whatever I have not mentioned has plenty of examples online, and they work, but you need the things below in addition to those articles to really get it working. 1. Discountasp.net sucks, because the basic thing you need to get your cloud app working is a CNAME record from your domain name to the cloud provider domain (x.cloudapp.net for Azure). They don't support it, and their support people will give you really bad ideas which will break your site links and none of which will work. So I am currently using NameCheap DNS, where this is supported and whose awesome support team went above and beyond to fix other issues which I will mention here. "At this point you have two choices: 1. You can use the solution provided below; or 2. You can use a third party...

Setting the trace level for windows azure logging

It can be confusing to figure out how to set your default trace logging level in Windows Azure. You have to set it like this, so the system understands that everything you write to the log goes out at log level 2, which is Warning. Otherwise, if you use caching (say), there are too many log lines to make sense of it all. http://msdn.microsoft.com/en-us/library/system.diagnostics.tracelevel.aspx

<system.diagnostics>
  <switches>
    <add name="logLevel" value="2" />
  </switches>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.8.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="AzureDiagnostics">
        <filter type="" />
      </add>
      ...

Redirect from https login page to http page loses session

This is because a round trip to the client is needed to save the cookie. To get around this, redirect from the login page to another https page, and from there redirect to the http page - then the session is not lost.

Windows Azure: The certificate's private key could not be accessed.

<WebRole name="XYZ" vmsize="ExtraSmall" enableNativeCodeExecution="true">
  <Runtime executionContext="elevated"></Runtime>

The above is the fix for it. If you don't want to do that, the below can also work:

<Certificates>
  <Certificate name="XYZ" permissionLevel="limitedOrElevated" storeLocation="LocalMachine" storeName="My" />
</Certificates>

Create certificate for azure web role

http://msdn.microsoft.com/en-us/library/gg432987.aspx

Read and Write to Azure like FTP Client

I have the entire code listing here: http://www.snipsave.com/user/profile/vijaymohan#4410 It will not compile by default, but you can remove those lines, and it takes care of everything needed to host your files on the Azure cloud.

Comparing by value, and not by reference

This is a very good post: http://stackoverflow.com/questions/614713/datarow-comparison-not-working-as-expected

DateTimeParser for date strings with characters not supported in C# or .NET

This is an updated version of code I found online: http://chopapp.com/#tkn6endn I needed it to handle strings like EST, PST, CST, MST and EDT. This version can handle even more codes.

Using Windows Azure CDN to store static files for a website

I worked all night yesterday to get this working for my site. It turns out there are multiple issues you have to remember and work out for all of it to work properly. 1. Write a program which uploads your folder structure to BLOB storage. 2. This program should also set the proper content types for the different types of files. For example, if you do not set the proper content type on a CSS file, your site's CSS will not load on any modern browser. This is why you will search high and low for "cross-domain CSS file in Azure CDN not rendering on browser". 3. There is a security setting which prevents cross-domain CSS files from loading on a web site. This means you have to be able to reach the correct target page on the Azure CDN from a subdomain of your own domain name. Example: if you host the site on xyz.com, your CSS cannot come from abc.com, but it can come from abc.xyz.com. 4. To do this you have to enable a custom domain on the Azure storage account. http://www.windowsazure.com/en-us/develop/n...

C# to upload contents of a folder including sub folders to Windows Azure Blob Storage

This is useful if you want to upload stuff like what you would do with an FTP site linked to IIS:

using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace TestConsoleApp
{
    class Program
    {
        private static CloudBlobContainer _blobContainer = null;
        private static CloudBlobClient _blobClient = null;
        private static CloudStorageAccount _cloudStorageAccount = null;

        static void Main(string[] args)
        {
            _cloudStorageAccount = CloudStorageAccount.Parse("KEY");
            _blobClient = _cloudStorageAccount.CreateCloudBlobClient();
            ...
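The excerpt is cut off; below is a hedged sketch of how the rest of the idea could look - walking the folder recursively, naming blobs after the relative path, and setting a content type so CSS/JS served from the CDN is accepted by browsers. UploadFolder, GetContentType and the type map are my own, built on the 1.x StorageClient API and the _blobContainer field from the listing above:

// Call as: UploadFolder(localRoot, localRoot); after _blobContainer has been set up.
private static void UploadFolder(string localFolder, string rootFolder)
{
    foreach (string file in Directory.GetFiles(localFolder))
    {
        // Blob name mirrors the folder structure, e.g. "css/site.css".
        string blobName = file.Substring(rootFolder.Length + 1).Replace('\\', '/');
        CloudBlob blob = _blobContainer.GetBlobReference(blobName);
        blob.Properties.ContentType = GetContentType(Path.GetExtension(file));
        blob.UploadFile(file);
    }
    foreach (string subFolder in Directory.GetDirectories(localFolder))
    {
        UploadFolder(subFolder, rootFolder);
    }
}

private static string GetContentType(string extension)
{
    switch (extension.ToLowerInvariant())
    {
        case ".css":  return "text/css";
        case ".js":   return "application/javascript";
        case ".html": return "text/html";
        case ".png":  return "image/png";
        case ".jpg":  return "image/jpeg";
        default:      return "application/octet-stream";
    }
}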

Redirecting to Azure URL from DiscountASP.Net DNS

Their customer support took a long time to reply, made me go in circles, and finally gave me a link; then I figured out what to do. "You'll need to use the URL Rewrite module to achieve your idea. You will create a rule within your web.config to redirect any traffic requests to "(www.)xyz.com" to be sent to xyz.cloudapp.net. Update your web.config file or connect via IIS manager to create this rule ( http://www.iis.net/learn/extensions/url-rewrite-module/using-the-url-rewrite-module )." First you have to install an IIS extension which allows you to use your local IIS Manager (even on Windows 8) to talk to their IIS: http://www.iis.net/downloads/microsoft/iis-manager It will download and execute an integrated installer which installs the extension. Then you connect to DiscountASP.net using the instructions below: https://support.discountasp.net/KB/a400/how-to-connect-to-windows-2008iis-7-using-microsoft.aspx Once you connect, it will ask you whether to ...

Quick AzureQueueManager

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure.ServiceRuntime;

public enum QueueNameCategory
{
    UnSpecified = 0,
    A = 1,
    B = 2,
    C = 3,
    D = 4,
    E = 5
}

public class AzureQueueManager
{
    private static Dictionary<QueueNameCategory, CloudQueue> _queueLookUp = new Dictionary<QueueNameCategory, CloudQueue>();
    private static CloudQueueClient _queueClient = null;
    private static CloudStorageAccount _cloudStorageAccount = null;
    private static bool _initialized = false;

    static AzureQueueManager()
    {
        ...
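The excerpt is cut off; below is one way the rest of such a class could look (the static constructor above presumably does the initialization shown here as Initialize). It is a sketch of my own, assuming the StorageClient 1.x API, and the "StorageConnectionString" setting name is a placeholder:

private static void Initialize()
{
    if (_initialized) return;
    // Connection string setting name is a placeholder for whatever the role defines.
    _cloudStorageAccount = CloudStorageAccount.Parse(
        RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
    _queueClient = _cloudStorageAccount.CreateCloudQueueClient();
    _initialized = true;
}

public static void Enqueue(QueueNameCategory category, string message)
{
    GetQueue(category).AddMessage(new CloudQueueMessage(message));
}

public static CloudQueueMessage Dequeue(QueueNameCategory category)
{
    CloudQueue queue = GetQueue(category);
    CloudQueueMessage message = queue.GetMessage();
    if (message != null)
    {
        queue.DeleteMessage(message);   // remove it once we have read it
    }
    return message;
}

private static CloudQueue GetQueue(QueueNameCategory category)
{
    Initialize();
    if (!_queueLookUp.ContainsKey(category))
    {
        // Queue names must be lowercase.
        CloudQueue queue = _queueClient.GetQueueReference(category.ToString().ToLowerInvariant());
        queue.CreateIfNotExist();
        _queueLookUp[category] = queue;
    }
    return _queueLookUp[category];
}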

Right Click Create Unit Test Option gone in VS 2012

http://social.msdn.microsoft.com/Forums/en-US/vssetup/thread/f76aafa9-3ea8-4ff3-940d-0dcf3bb70273 Really sad to know this.

Upgrading to Windows Server 2012 Hyper-V

I recently had the opportunity to do this. Things to note: 1. Export the VMs, and before exporting set their memory to static. 2. The upgrade takes a long time and reboots several times. 3. Before importing the VMs, go to the Virtual Switch Manager and configure an external network. 4. Importing the VMs takes a long time. 5. Be sure to upgrade the integration services on all the VMs.

Download VS 2012 Remote Debugger

http://www.microsoft.com/en-us/download/details.aspx?id=30674 This is most useful. The link does not show up right away in Google at present.

Circular references when there really isn't any?

This link saved me a week of frustration today: http://nobodylikesasmartass.wordpress.com/2009/06/04/visual-studio-circular-dependency-nonsense/ Hail Google!

Fixes to touch support in Javascript

I have been using the code from the below URL to provide swipe events from javascript: http://padilicious.com/code/touchevents/ For some reason, it has stopped working for iOS. I had to painfully fix the code which determines the swipe direction:

function determineSwipeDirection() {
    if ((swipeAngle <= 46) && (swipeAngle >= 0)) {
        swipeDirection = 'left';
        //alert(swipeAngle + "-left");
    } else if ((swipeAngle <= 360) && (swipeAngle >= 352)) {
        swipeDirection = 'left';
        //alert(swipeAngle + "-left");
    } else if ((swipeAngle >= 158) && (swipeAngle <= 213)) {
        swipeDirection = 'right';
        //alert(swipeAngle + "-right");
    } else if ((swipeAngle >= 45) && (swipeAngle <= 135)) {
        swi...