Disabling Change Tracking in Entity Framework

This might come in handy if you are having performance issues with Entity Framework and its change tracking option.

Short version:

using (SchoolEntities context = new SchoolEntities())
{
    context.Course.MergeOption = MergeOption.NoTracking;

    foreach (var course in context.Course)
    {
        // the courses we enumerate are in the Detached state
        Console.WriteLine("{0} {1}", course.CourseID, course.Title);
    }
}
using (SchoolEntities context = new SchoolEntities())
{
    var query = new ObjectQuery<Course>(
        "SELECT VALUE c FROM SchoolEntities.Course AS c",
        context,
        MergeOption.NoTracking);

    foreach (var course in query)
    {
        // the courses we enumerate are in the Detached state
        Console.WriteLine("{0} {1}", course.CourseID, course.Title);
    }
}
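If you are using a newer, DbContext-based version of Entity Framework (4.1 or later), the rough equivalent is the AsNoTracking() extension method. Below is a minimal sketch, assuming a hypothetical DbContext named SchoolContext that exposes a Courses set:

using (SchoolContext context = new SchoolContext())
{
    // AsNoTracking() lives in System.Data.Entity; the entities returned
    // here are not added to the context's change tracker
    foreach (var course in context.Courses.AsNoTracking())
    {
        Console.WriteLine("{0} {1}", course.CourseID, course.Title);
    }
}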

Thanks to Gil Fink for this post.

Read the whole article at: http://blogs.microsoft.co.il/blogs/gilf/archive/2009/02/20/disabling-change-tracking-in-entity-framework.aspx

 

Configuring IIS7 and ELB for HTTPS using wildcard common name

I have recently been asked to add support for HTTPS to one of our web applications. With little to no experience performing this kind of operation, my first thought was: oh, it shouldn't be too hard, take the certificate, plug it into the web server and let's go for a drink… not really what happened. I'll try to give as many details and references as possible to help anyone going through this process.

Let's start with the environment I am running:

  • A couple of IIS7 web servers
  • Amazon Elastic Load Balancer (ELB)
  • DigiCert Certificate (any trusted provider should work)
  • Wildcard domain (*.mydomain.com)

To create your certificate, you will need to send a CSR (Certificate Signing Request) to your provider.

For IIS 7, follow the instructions below:

  1. Click Start, then Administrative Tools, then Internet Information Services (IIS) Manager.
  2. Click on the server name.
  3. From the center menu, double-click the “Server Certificates” button in the “Security” section (it is near the bottom of the menu).
  4. Next, from the “Actions” menu (on the right), click on “Create Certificate Request.” This will open the Request Certificate wizard.
  5. In the “Distinguished Name Properties” window, enter the information as follows:
    Common Name – The name through which the certificate will be accessed (usually the fully-qualified domain name, e.g., www.domain.com or mail.domain.com).
    Organization – The legally registered name of your organization/company.
    Organizational unit – The name of your department within the organization (frequently this entry will be listed as “IT,” “Web Security,” or is simply left blank).
    City/locality – The city in which your organization is located.
    State/province – The state in which your organization is located.
    Country/region – If needed, you can find your two-letter country code at http://www.digicert.com/ssl-certificate-country-codes.htm.
  6. Click Next.
  7. In the “Cryptographic Service Provider Properties” window, leave both settings at their defaults (Microsoft RSA SChannel and 2048) and then click Next.
  8. Enter a filename for your CSR file.
    Remember the filename that you choose and the location to which you save it. You will need to open this file as a text file and copy the entire body of it (including the Begin and End Certificate Request tags) into the online order process when prompted.

The full tutorial with screenshots and a video is located at http://www.digicert.com/csr-creation-microsoft-iis-7.htm
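Opened in a text editor, the CSR file created in step 8 will look roughly like the following (the body here is only a placeholder; CSRs generated by IIS are usually labelled NEW CERTIFICATE REQUEST):

-----BEGIN NEW CERTIFICATE REQUEST-----
(base64-encoded request data)
-----END NEW CERTIFICATE REQUEST-----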

Your provider should then generate the SSL certificate, and you should be able to download it in the appropriate format for IIS. I would recommend using individual .crt files in a zipped archive. You will find a bunch of files; the one of interest is star_mydomain_com.crt. You can now complete the request in IIS by following steps 1 to 3 from the CSR instructions above; the fourth step is to use “Complete Certificate Request” from the Actions pane.

Complete Certificate Request

One important thing to note here: if you have a wildcard domain, the Friendly Name used to complete the CSR should be exactly the same as your common name, *.mydomain.com (this might not seem very important, but you will see why afterwards). If you have already imported the certificate, you can use the Certificates MMC snap-in to change the friendly name of the certificate; find more on this page: http://technet.microsoft.com/en-us/library/cc753195%28v=ws.10%29.aspx

Once done, you can duplicate your certificate for use on your other IIS boxes, if any, and repeat the same steps.

Let's jump to the configuration of our Elastic Load Balancer (Amazon ELB) to support HTTPS and import our certificate there. There are several techniques and ways to do this; I will use the one that worked in my case.

Find the basic reference in the Amazon Web Services ELB documentation: http://docs.amazonwebservices.com/ElasticLoadBalancing/latest/DeveloperGuide/UserScenarios.html

Amazon Elastic Load Balancer HTTPS Setup

You will need to fill in the following fields:

Certificate Name:* – any meaningful name to recognize your certificate afterwards

Private Key:* – your RSA private key (PEM format)

Public Key Certificate:* – your public key certificate (PEM format)

This is where it becomes a bit tricky: you don't get this information from your basic CSR or from the certificate itself, so you need to use OpenSSL to generate it.

First you need to export your SSL certificate together with its private key; try these steps:

  1. Start the Microsoft Management Console (Run > mmc.exe).
  2. Click the ‘Console’ menu and then click ‘Add/Remove Snap-in’.
  3. Click the ‘Add’ button and then choose the ‘certificates’ snap-in and click on ‘Add’.
  4. Select ‘Certificates’ and click ‘Add’.
  5. Select ‘Computer Account’ then click ‘Next’.
  6. Select ‘Local Computer’ and then click ‘OK’.
  7. Click ‘Close’ and then click ‘OK’.
  8. Expand the menu for ‘Certificates’ and click on the ‘Personal’ folder.
  9. Right click on the certificate that you want to export and select ‘All tasks’ > ‘Export’.
  10. A wizard will appear. Make sure you check the box to include the private key and continue through with this wizard until you have a .PFX file.

You can find the illustrated tutorial on this page: http://nl.globalsign.com/en/support/ssl+certificates/microsoft/all+windows+servers/export+private+key+or+certificate/
You need to have OpenSSL on your machine for the next step: http://www.openssl.org/

Export the private key file from the pfx file

openssl pkcs12 -in filename.pfx -nocerts -out key.pem

Export the certificate file from the pfx file

openssl pkcs12 -in filename.pfx -clcerts -nokeys -out cert.pem

Remove the passphrase from the private key

openssl rsa -in key.pem -out server.key

Use the data in the server.key file for the Private Key:* field required by ELB. You will also need to download a PEM version of your SSL certificate from your provider, which will include all the keys. Then take the first block:
-----BEGIN CERTIFICATE-----
Lots of fancy base64 here!!!
-----END CERTIFICATE-----

Copy and paste it into the Public Key Certificate:* field, and Amazon ELB should accept your certificate.
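Optionally, as a sanity check before pasting, you can verify that the private key and the certificate actually belong together by comparing their moduli with OpenSSL; the two hashes should be identical:

openssl rsa -in server.key -noout -modulus | openssl md5
openssl x509 -in cert.pem -noout -modulus | openssl md5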

Update #1 for configuration of ELB:

There is also one option that avoids SSL configuration on the ELB entirely: just set TCP port 443 as both the inbound and outbound port, and your web server will act as the SSL termination point. This should work straight away.

HTTP Log Analysis using Microsoft Log Parser

There are several tools freely available on the web to analyze your website traffic, and they do a great job at it (Google Analytics, Google Webmaster Tools, Bing Webmaster Tools…). These tools provide great, free value for tracking your traffic and troubleshooting potential issues on your website. Like any tool, however, they have limitations, and the need for alternative/complementary solutions arises.

In this post I will discuss the use of Microsoft Log Parser to analyze "hits" on your web server. Any website, whatever its size or complexity, ends up facing these types of problems over time:

1) URL changes
2) Removal of old pages
3) Error pages

To some extent, the tools mentioned above will show you these errors, but they might not give you exactly what you are looking for from a real data analysis perspective. Take error pages, for example: some of your pages crash and return an HTTP 500 status code, and you might not be able to recover that data using the normal Google Analytics JavaScript, depending on how you handle these crashes.

One way to get access to this data is to analyze your web server logs (if logging is enabled, of course). Without going into too much detail, below are some utility queries that will help you troubleshoot issues in your application. (After installing Log Parser you will be able to run the syntax below from the command line.)

HTTP 200 OK from Google Bots
[SQL]
LogParser.exe "SELECT date, count(*) as hit INTO HTTP200.jpg FROM Path\to\Logs\*.log WHERE cs(User-Agent) LIKE '%%google%%' AND sc-status = 200 GROUP BY date ORDER BY date" -i:w3c -groupSize:800x600 -chartType:Area -categories:ON -legend:OFF -fileType:JPG -chartTitle:"HTTP 200 Hits"
[/SQL]

HTTP 301 Moved Permanently from Google Bots
[SQL]
LogParser.exe "SELECT date, count(*) as hit INTO HTTP301.jpg FROM Path\to\Logs\*.log WHERE cs(User-Agent) LIKE '%%google%%' AND sc-status = 301 GROUP BY date ORDER BY date" -i:w3c -groupSize:800x600 -chartType:Area -categories:ON -legend:OFF -fileType:JPG -chartTitle:"HTTP 301 Hits"
[/SQL]

HTTP 4xx Not Found / Gone from Google Bots
[SQL]
LogParser.exe "SELECT date, count(*) as hit INTO HTTP4xx.jpg FROM Path\to\Logs\*.log WHERE cs(User-Agent) LIKE '%%google%%' AND sc-status >= 400 AND sc-status < 500 GROUP BY date ORDER BY date" -i:w3c -groupSize:800x600 -chartType:Area -categories:ON -legend:OFF -fileType:JPG -chartTitle:"HTTP 4xx Hits"
[/SQL]

These queries will produce nice graphs of how many HTTP 200, 301 and 4xx hits you receive per day while the Google bot is crawling your site.
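Following the same pattern, you could also chart the HTTP 5xx server errors mentioned earlier. This is an untested sketch that simply mirrors the queries above:

[SQL]
LogParser.exe "SELECT date, count(*) as hit INTO HTTP5xx.jpg FROM Path\to\Logs\*.log WHERE cs(User-Agent) LIKE '%%google%%' AND sc-status >= 500 AND sc-status < 600 GROUP BY date ORDER BY date" -i:w3c -groupSize:800x600 -chartType:Area -categories:ON -legend:OFF -fileType:JPG -chartTitle:"HTTP 5xx Hits"
[/SQL]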

You can also easily find out the same thing for your users by changing the cs(User-Agent) LIKE '%%google%%' condition to cs(User-Agent) NOT LIKE '%%bot%%'.
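For example, the HTTP 200 query above, restricted to non-bot traffic, would look roughly like this (the output file name HTTP200-users.jpg is just an arbitrary choice):

[SQL]
LogParser.exe "SELECT date, count(*) as hit INTO HTTP200-users.jpg FROM Path\to\Logs\*.log WHERE cs(User-Agent) NOT LIKE '%%bot%%' AND sc-status = 200 GROUP BY date ORDER BY date" -i:w3c -groupSize:800x600 -chartType:Area -categories:ON -legend:OFF -fileType:JPG -chartTitle:"HTTP 200 Hits"
[/SQL]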

Of course, this is only an approximation, because not all bots include the keyword "bot" in their user-agent.

Hopefully this comes in handy. If you have more queries to share, drop by and leave a comment.
Further reading:

http://blogs.iis.net/carlosag/archive/2010/03/25/analyze-your-iis-log-files-favorite-log-parser-queries.aspx

http://logparserplus.com/

Using the Windows hosts file

The Windows hosts file, located under "[SystemDriveLetter]:\Windows\System32\drivers\etc", is very useful when you have to test web applications hosted either locally or on a remote server and you do not wish to map them in your DNS.

Let's take an example where you have a website named http://www.my-simple-web-application.com. You will most likely have 3-4 versions of the application: dev, preprod, test and live (where live would be http://www.my-simple-web-application.com).

To facilitate testing you could come up with a standard way of addressing these environments:

http://dev.my-simple-web-application.com
http://preprod.my-simple-web-application.com
http://test.my-simple-web-application.com

Each of these sub-domains might point to the same server or to different ones. This is where the hosts file comes in handy; you can configure something like:

127.0.0.1 dev.my-simple-web-application.com
127.0.0.1 preprod.my-simple-web-application.com
127.0.0.1 test.my-simple-web-application.com

In this example all IP addresses are local; you can change them as needed. Be aware that this configuration has to be placed on each desktop (development and test) from which you want to use these sub-domains.
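If a change to the hosts file does not seem to be picked up right away, flushing the local DNS resolver cache (and restarting the browser) usually helps:

ipconfig /flushdns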

On another note, this configuration can also be achieved network-wide if you have a configurable router on which you can add global host entries.

There are a number of other situations where the hosts file can be helpful:
1) You are migrating your website to a new server; in this case you can specify your existing domain name in the hosts file and point it to the IP of the new server (see the example below).
2) You have multiple web servers hosting the same application and one of them is not working properly; you can target the misbehaving server by changing your hosts file to point only to that server.
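For the migration scenario in 1), the entry would look something like this (the IP address below is only a placeholder for your new server):

203.0.113.10 www.my-simple-web-application.com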

CodeIgniter – Pagination SEO Issue

I have recently been working with a PHP MVC framework called CodeIgniter on a complete web application solution. I have tried some major frameworks like CakePHP, Zend and Symfony, which are all very powerful frameworks for MVC and RAD development; the only thing they lacked was a bit more of the flexibility that CodeIgniter offers. Then again, I may not have taken enough time to get to know all of the specifics of the other frameworks, but while benchmarking I got acquainted with CodeIgniter much faster.

Even though CodeIgniter is a very flexible framework, it is very lightweight and some features needed for web applications have not been taken into account. With that in mind, the people behind EllisLab, Inc. made sure that these small gaps can easily be worked around by allowing complete customization of their libraries.

Here is my original issue:

I have an item listing page with pagination enabled, and I wanted the first page to be the root URL of the item page.
e.g. http://www.mysite.com/items

But what the CodeIgniter Pagination library generated for the first page was: http://www.mysite.com/result/1

That is pretty inconvenient for SEO, because the crawler will find two URLs with the same content while crawling the site.

Thus I modified the CI_Pagination library and created MY_Pagination.

First of all, I added a new variable called first_page_url as a class variable in the MY_Pagination class:

[php]

class MY_Pagination extends CI_Pagination {

var $first_page_url = ''; // The first page will have this URL

[/php]

I changed the original Pagination library's "First" link rendering from:

[php]

// Render the "First" link
if ($this->cur_page > ($this->num_links + 1))
{
    $output .= $this->first_tag_open.'<a href="'.$this->base_url.'">'.$this->first_link.'</a>'.$this->first_tag_close;
}

[/php]

to

[php]

// Render the "First" link
if ($this->cur_page > ($this->num_links + 1))
{
    $output .= $this->first_tag_open.'<a href="'.($this->first_page_url == '' ? $this->base_url : $this->first_page_url).'">'.$this->first_link.'</a>'.$this->first_tag_close;
}

[/php]

This way, if the configuration setting first_page_url is passed when the Pagination class is initialized, it will be used instead of base_url.
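For illustration, initializing the modified library from a controller could look like the sketch below; the URLs and row counts are hypothetical:

[php]

// Load the pagination library (CodeIgniter picks up MY_Pagination automatically)
$this->load->library('pagination');

$config['base_url']       = 'http://www.mysite.com/items/';
$config['first_page_url'] = 'http://www.mysite.com/items'; // page 1 links back to the clean root URL
$config['per_page']       = 20;  // hypothetical page size
$config['total_rows']     = 200; // hypothetical total number of items

$this->pagination->initialize($config);
echo $this->pagination->create_links();

[/php]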

Some modifications were also made to the pagination digit generation, from:

[php]

// Write the digit links
for ($loop = $start - 1; $loop <= $end; $loop++)
{
    $i = ($loop * $this->per_page) - $this->per_page;

    if ($i >= 0)
    {
        if ($this->cur_page == $loop)
        {
            $output .= $this->cur_tag_open.$loop.$this->cur_tag_close; // Current page
        }
        else
        {
            $n = ($i == 0) ? '' : $i;
            $output .= $this->num_tag_open.'<a href="'.$this->base_url.$n.'">'.$loop.'</a>'.$this->num_tag_close;
        }
    }
}

[/php]

to

[php]

// Write the digit links
for ($loop = $start - 1; $loop <= $end; $loop++)
{
    $i = ($loop * $this->per_page) - $this->per_page;

    if ($i >= 0)
    {
        if ($this->cur_page == $loop)
        {
            $output .= $this->cur_tag_open.$loop.$this->cur_tag_close; // Current page
        }
        else if ($loop == 1 && $this->first_page_url != '')
        {
            $output .= $this->num_tag_open.'<a href="'.$this->first_page_url.'">'.$loop.'</a>'.$this->num_tag_close;
        }
        else
        {
            $n = ($i == 0) ? '' : $i;
            $output .= $this->num_tag_open.'<a href="'.$this->base_url.$n.'">'.$loop.'</a>'.$this->num_tag_close;
        }
    }
}

[/php]

which makes sure that the page numbered 1 uses first_page_url as its href when first_page_url is available.

The complete file can be found here: MY_Pagination

SEO: Bounce rate of a website

Why is my bounce rate so high?

Definition: a bounce occurs when a visitor leaves your website after viewing only the entry page. The cases below can all equally be considered bounces from your website.

1) The visitor enters your site and presses back immediately (before or even just after the page has loaded).

2) The visitor waits for the page to load, stays on the page for some time, and then presses back or navigates to another site. (In this case the visitor might have found the information and then chosen to navigate elsewhere for supplementary information; or he/she might not have found it but read a few pieces to see what is there; a third case could be that the person did not like the website, its content or its colors, and went away.)

There is therefore a considerable number of aspects to take into consideration in order to answer "Why is my bounce rate so high?" more precisely. There isn't any straightforward answer to this question, but there are many sub-questions that can lead to possible solutions.

When you think about your bounce rate, here are the different questions that might come to mind.

Why is my bounce rate so high?

User Interface
Is my layout/presentation/design attractive to visitors?
Do my pages load slowly?
Do my pages have appropriate ads? Are these ads non-aggressive towards the user?
Are my pages browser friendly? (Can they be viewed the same way at any resolution, with any browser?)

Content

Does

Windows Live Writer

It's been nearly a year now since I last used Windows Live Writer, and there seem to have been a lot of enhancements and changes compared to the first version I tried. Today, I'll be giving the beta version (Build 14.0.5025.904) a try. The setup was quite easy and straightforward, warning me that I must enable XML-RPC on my blog before I could continue.

The first thing I installed was the S3 Object plug-in, to be able to upload images directly to my Amazon S3 account and use them in this post.

This image comes from my S3 bucket.

There are many features, such as Insert Map. Let's try it out and point out Mauritius, for example 🙂

Where I live.

Nice. I have also created a marker (push pin), which does not seem to appear. When you create a map in Windows Live Writer, it's not the actual map that appears but an image with a link to the actual map… 🙁

One other comment: there is a problem when the connection drops; the application just hangs during an operation. That was the case for my first post. Hopefully this will be fixed.

Conclusion: a very nice and elaborate tool for blogging. I will be trying it for a few days and will post some feedback here, or create a new post if need be.

My previous article about Windows Live Writer can be found here.

Backup and share files using MozyHome and Dropbox

If you are wondering what choices you have to share and back up your files, you can easily find hundreds of alternatives on the web today, each with its own features and technologies.

I came across two free tools that I use for backup and file sharing:

MozyHome Remote Backup

 

MozyHome is a small application that runs in the background and backs up any folder you configure it to. It will encrypt and then upload your files to your 2 GB of free space. You have several tools that enable you to restore any lost files. Nice for backing up documents and project files. There is also a paid version with unlimited space.

Dropbox

 

Dropbox is a new tool that just went public last week. It's similar to MozyHome, with 2 GB of free space and the possibility of backing up your files, plus some extra features such as file sharing and automatic sync. Personally I use it only for file sharing, since your files need to be in the Dropbox folder (a similar concept to the Windows Live Messenger sharing folder) in order to be shared.

It also allows sharing of files and gives you a nice public URL that you can easily send to the person you want to share your files with. The only missing feature I see here is that you can't share an entire folder with the files inside, i.e. when you want to share several photos you can't directly get a link to the folder; you must share it with someone who already has Dropbox. A feature that would provide a link to a zipped version of an entire folder would be nice.