Exception: Collection was modified; enumeration operation may not execute.

It’s been a while since I last ran into this exception. Yesterday, while testing a new module I had implemented, I got it again. I remembered hitting it some time back last year and managing to solve it, so here is one solution when you are dealing with this kind of issue:


IList<Product> productList = new List<Product>();
productList.Add(new Product("Some product 1"));
productList.Add(new Product("Some product 2"));
productList.Add(new Product("Some product 3"));
productList.Add(new Product("Some product 4"));

// Use a plain for loop in reverse order instead of foreach, so that
// items can be edited, updated or deleted safely while iterating.
for (int i = productList.Count - 1; i >= 0; i--)
{
    // Perform the Edit, Update or Delete operation here, e.g.:
    // productList.RemoveAt(i);
}


A list of other solutions can be found on the following websites:

IT workers gaining weight and possible health problems

I’ve just found a nice article about a study performed by CareerBuilder.com; it talks about how IT workers have begun to gain weight just by sitting all day in front of their computers :). Let’s hear what they have to say.

Here is a little intro:

Feeling a little, shall we say, sluggish lately? You might be among the vast ranks of IT workers who have put on some extra heft while sitting at their desks.

A study by CareerBuilder.com found that half of U.S. IT workers have gained weight at their current jobs.

The study, which polled nearly 7,700 participants from Feb. 11 through March 13, found that 34 percent of IT workers report they have gained more than 10 pounds in their current positions. Even more alarming, 17 percent say they have put on more than 20 pounds!…

Read the full article on: IT Workers Weigh In on Health Habits

Paging large result sets with SQL query

I’ve been searching the internet for ways to page through a large result set from an SQL query. There are several solutions to be found: some are custom, others come ready-made with the ASP.NET controls. This is the solution I have been using to page large data sets on a current Web 2.0 site.

Here is a common problem:
A website offering search capabilities to its users is having performance issues retrieving results from its database. It has more than 100K records, and users can perform either a criteria search or a plain text search on them. A criteria search can take a lot of time to process, depending on how it has been implemented. Statistics have shown that the average result count is about 1,000 records for a text search and 500 for a criteria search.

The Stored Procedure
Let’s take this SQL query as a sample to get results for a typical text search.

SELECT Id, ProductName, ProductDescription, ProductRating, ProductPrice
FROM tbl_Product
WHERE ProductName LIKE '%' + @Query + '%'
   OR ProductDescription LIKE '%' + @Query + '%'

This stored procedure takes one parameter, the query ( @Query ) that the website visitor types in to search for something. It searches both the ProductName and ProductDescription fields, which may produce a huge result set for some key terms.

We need to optimize this query to be able to perform custom paging on the other side.

Here is how the optimized code would look:

--Calculate the number of records matching this particular request
SELECT @RecordCount = COUNT(1)
FROM tbl_Product
WHERE ProductName LIKE '%' + @Query + '%'
   OR ProductDescription LIKE '%' + @Query + '%'

--Return only the rows for the requested page
SELECT Row, Id, ProductName, ProductDescription, ProductRating, ProductPrice
FROM
(SELECT ROW_NUMBER() OVER (ORDER BY Id ASC) AS Row, Id, ProductName, ProductDescription, ProductRating, ProductPrice
 FROM tbl_Product
 WHERE ProductName LIKE '%' + @Query + '%'
    OR ProductDescription LIKE '%' + @Query + '%'
) AS TmpTbl
WHERE Row BETWEEN @RowStart AND @RowEnd
ORDER BY ProductRating

I have added two input parameters, @RowStart and @RowEnd, and one output parameter, @RecordCount. With these new parameters you will be able to launch a query based on the number of records you want to display on a particular page. If you want to display 10 records per page, your input parameters will have the following values
@RowStart = 1
@RowEnd = 10
since the query is inclusive.
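To make the mapping concrete, here is a minimal sketch of the row-range math, assuming the 1-based inclusive @RowStart/@RowEnd convention used above. The helper names pageToRows and pageCount are hypothetical, not part of ASP.NET or SQL Server:

```javascript
// Map a 1-based page number to the inclusive row range for the query.
function pageToRows(page, pageSize) {
  var rowStart = (page - 1) * pageSize + 1; // first row on this page
  var rowEnd = page * pageSize;             // last row on this page
  return { rowStart: rowStart, rowEnd: rowEnd };
}

// Total number of pages, given the @RecordCount output parameter.
function pageCount(recordCount, pageSize) {
  return Math.ceil(recordCount / pageSize);
}
```

Page 1 with a page size of 10 maps to rows 1–10, page 3 to rows 21–30, and a record count of 1,000 yields 100 pages.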

You can use your preferred data control to display the data and build up the paging logic in ASP.NET.

Annoying software: a rogues’ gallery

Found this little article while reading some newsletter this morning:

Here is a little preview.

Adobe Reader
What does Adobe Reader do? Displays PDF pages. How does it do it? With as much bloody-minded bureaucracy, delay and needless interaction as possible. Perhaps it’s because we humans have been spoiled by books, where the gap between wanting to read something and reading it is as short as the time taken to lift the cover. But Reader’s incessant updates (demanding you reset your computer — why?), thundering great list of modules to load …

iTunes
Oh, Apple. You created a domain where humans came first. You took usability and distilled it into an art form. Now look at you. iTunes is a music player the size of a fat-bottomed whale that gobbles resources like krill. It spends half its time trying to sell us stuff and the other half trying to stop us using it. But that’s not as bad as your auto-update policy: slipping us stealth copies of Safari under the cover of important version updates to iTunes and Quicktime….

Windows Update
Your machine will reset in four minutes. Your machine will not shut down until these five updates are installed. You must restart your machine now. You will install Microsoft Genuine Advantage. Please wait while these updates are installed. Please shut down all applications before applying this update. Pop! New updates are ready to be installed. And now that we’ve stopped you doing whatever it was you were doing (like we care)…

Read the entire article from zdnet: Annoying software: a rogues’ gallery

Padding is invalid and cannot be removed.

Stop Google crawling WebResource.axd & ScriptResource.axd
As an ASP.NET developer I often got the error message “Padding is invalid and cannot be removed”. It’s a pretty annoying message that I had been trying to get rid of for days. It was caused by Google trying to crawl and index my WebResource.axd and ScriptResource.axd URLs, which carry a key tied to the session. Since Google caches the pages it visits, the session has already expired by the time it crawls a page again; when it then requests WebResource.axd or ScriptResource.axd with the old key, an exception is raised.

Therefore, the simple solution to this problem is to modify the robots.txt file in your root directory

and add the following at the end:

User-agent: *
Disallow: /WebResource.axd
Disallow: /ScriptResource.axd

With this no more issues regarding invalid padding.

If this does not solve your issue you can take a look at what other users propose here.

Some update on this issue, which I found while googling some days back:

On this post you can find out how to compress WebResource.axd and, along the way, prevent this error from occurring.

Use the tool found at the website below to generate a machine key and a decryption key:
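Once generated, the keys go into the machineKey element of web.config, so they stay fixed across application restarts (another common cause of this padding error). The element and attribute names below are the standard ASP.NET ones; the key values are placeholders you would replace with your generated keys:

```
<system.web>
  <machineKey
      validationKey="[your generated validation key]"
      decryptionKey="[your generated decryption key]"
      validation="SHA1"
      decryption="AES" />
</system.web>
```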

Retrieving anchor value from URL

Lately I have been trying to get a grasp of the anchor value from the URL using ASP.NET. From what I learned on the forums, it cannot be done server-side. (This might be wrong, since PHP has a way (http://www.php.net/manual/en/function.parse-url.php) to parse the anchor value out of a URL.)

Case study:

e.g. http://www.somesamplebigsite.com/product.aspx?q=123&u=09#description
The URL above sends the user to a product page, and the browser then interprets the anchor and scrolls down to the description section. But why the heck would I want to get the anchor, since it’s just an anchor? There might be several reasons for that. Here are two of them:

  • You might not want to pass an extra parameter in your URL query string because of Referencing issues
  • You would like to perform an action when the page reaches the anchor: for example, open an AJAX popup, load some information, or execute some JavaScript in the description section of the product.

There might be several other reasons, but these two give you an idea of what is to be achieved.

You can’t get access to the anchor value using server-side ASP.NET; it will be seen neither in Request.Url nor in Request.RawUrl. So, for those who are trying in vain to get to this information: don’t bother anymore. You can just use a bit of JavaScript to do it, and then, if you want, either post the value back to the server or send it via AJAX.

// Grab everything after the "#" in the current URL, if present
var anchorValue = "";
var url = document.location;
var strippedUrl = url.toString().split("#");
if (strippedUrl.length > 1)
    anchorValue = strippedUrl[1];

With these few lines of JavaScript you will be able to determine the anchor value that has been sent in your URL.
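The same snippet can be wrapped into a small reusable function; getAnchor is a hypothetical helper name. It takes the URL as a string so it can be tested outside the browser (in a page you would pass document.location.toString()):

```javascript
// Return the anchor (fragment) part of a URL string, or "" if none.
function getAnchor(url) {
  var parts = url.toString().split("#");
  return parts.length > 1 ? parts[1] : "";
}
```

For the sample URL above, getAnchor("http://www.somesamplebigsite.com/product.aspx?q=123&u=09#description") returns "description". Once you have the value, you could for instance copy it into a hidden field so the next postback carries it to the server.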

Goodbye Netscape Navigator

More than 10 years have passed since the first appearance of Netscape Navigator, and now it is time to say goodbye to this browser. As announced by AOL, Netscape browser support will end on the 1st of February 2008.

I still remember playing around with Navigator 3 or 4, I think, during my first days using the internet. It was kind of impressive the first time, I must say, but afterwards I switched to Internet Explorer, and now I use both Internet Explorer and Firefox.

Netscape Navigator 9

IBM OmniFind Yahoo! Edition 8.4.2

Quoted from OmniFind website.

IBM and Yahoo! make information actionable with a simple,
no-charge enterprise search solution.

I have been able to play for a few months with the OmniFind Yahoo! Edition from IBM. It’s quite a powerful tool, offered for free, compared to the other search engines available on the net such as Google Custom Search Business, Google Mini, ISYS Web, Mondo Search, and some other tools that are listed at http://www.searchtools.com.

Here are the features of OYE (OmniFind Yahoo! Edition)

Search internal web sites, local and remote file systems, and the public web

  • Search up to 500,000 documents and 5 separate collections
  • Support for 200+ file types
  • Search in 30+ languages
  • Translated into 15 languages

Intuitive out-of-the-box user interface with advanced features

  • Based on Yahoo! Search UI
  • Easily customized with graphical tools to configure look-and-feel, UI elements, branding
  • Spell suggestion
  • Full wildcard support

Increased relevancy of results

  • Configurable synonyms and featured links
  • Tunable relevancy controls
  • Top queries, no results, and result click-through reporting to enable fine-tuning

Easy to deploy, configure, and maintain

  • No prerequisites
  • 3-click installer
  • 1-click startup
  • Go from the installer to searching in minutes

Open and extensible

  • Built on Apache Lucene
  • Open URL-based APIs (REST)
  • Define, populate and search your own custom fields
  • Easily embeddable and customizable UI output (XML/XSLT/HTML, HTML snippets)

From my point of view, the only inconvenience with this tool is that it is not open to plug-in or add-in functionality, which could really help make it much better for users. Then again, the tool is complete and contains all the functionality needed for an entry-level enterprise search engine.

Here are some screenshots of the new version of OmniFind, which I just upgraded from 8.4.1:
Home - Default Index

Meta Tags section - New to the 8.4.2

OmniFind Custom Search Page Designer

Lenovo 3000 N100

I bought a Lenovo 3000 N100 six months ago. Now it’s time for a review of this piece of hardware.

The hardware specification is as follows:

Model: N100 0768-FFG
CPU: Core 2 Duo T5600 1.83 Ghz
Memory: 1 GB DDR2 PC5300
Hard Drive: 120 GB HDD (Fujitsu) SATA
Screen: 15.4″ WSXGA+ 1680×1050 Glossy
Optical Drive: DVD-RW Matshima
GPU: NVIDIA 7300 Go 128 MB (dedicated)
Network/Wireless: Intel Wireless 3945A/B/G, Realtek 10/100 Ethernet Card, Modem and Bluetooth
Inputs: 84 Key Keyboard with Two Button Touchpad with Scroll Bar
Buttons: Power, Lenovo Care, Power Up and Down, Mute, and WiFi/Bluetooth On/Off Switch.

  • Four USB 2.0
  • Four-Pin Firewire
  • 4-in-1 Card Reader
  • Ethernet
  • Modem
  • VGA Out
  • S-Video Out
  • Microphone
  • Headphone
  • Security Lock
  • Power Connector

Integrated Camera (1.3 MegaPixel)
Fingerprint reader
6c Li-Ion

It was delivered with Windows Vista Business Edition, along with some other software that makes it run slower than it should. So, after the first month with Vista on it, I decided to downgrade back to Windows XP Pro SP2, and it runs just fine. Beware: before doing this operation, make sure you download all the drivers you will need, since the machine is not delivered with a driver CD. I found all the drivers on the Lenovo website.

Here are some stats and tests that I did.

Using CPU-Z (http://www.cpuid.com/cpuz.php)





Super Pi Calculation (http://www.overclock.net/downloads/28044-definitive-super-pi-thread.html)

Will be coming soon… let me find some time.