OCDProgrammer.com

It's Microsoft's World, and I'm just living in it

Project In Review – Reducing ViewState

November 25, 2009 03:00 by ckincincy

In the .NET world, ViewState can get out of hand in a hurry.  On some of our more intense pages, the ViewState generated by controls accounted for half or more of the total page size.  A brief search by a coworker turned up this solution.

The main thing to take from this article is to insert this snippet of code in your page, or in a custom base page:

    protected override PageStatePersister PageStatePersister
    {
        get
        {
            return new SessionPageStatePersister(this);
        }
    }

This little beauty can reduce the ViewState sent to the client by more than 90%, which is significant.

Now the obvious limitation is that your real ViewState is now stored in session state on the server, so this trades page size for server memory and will add some overhead to your web application.
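
As a side note, if you don't want to repeat the override on every page, a shared base page is an easy way to wire it up once.  A minimal sketch, assuming your pages inherit from a class like this (BasePage is an illustrative name, not part of the solution we used):

    using System.Web.UI;

    // Every page that inherits from BasePage keeps its view state in
    // session instead of serializing it into the rendered HTML.
    public class BasePage : Page
    {
        protected override PageStatePersister PageStatePersister
        {
            get { return new SessionPageStatePersister(this); }
        }
    }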


Project In Review

November 22, 2009 17:41 by ckincincy

So at work I just finished up a rather large project.  It is live and customers are using it.

What I am going to try to do is recap some of what I've done and learned over the past few months, so others can learn and I can easily recall :-)

Some of the posts will be rather trivial in nature, so bear with me.


Abusive Spiders - GateKeeper

January 11, 2009 15:39 by ckincincy


Chris Blankenship has been on a crusade lately against abusive spiders.  I was interested in some of the fixes he was applying, and a few weeks ago I got an email from him about a solution he was developing, 'GateKeeper'.  I reviewed the code and it all looked good, but he wasn't yet ready to fully release it into the wild.

It finally got to that point, and I installed it on my two DotNetBlogEngine.net blogs.  So far I have been really impressed with it, and I'm interested to see how it affects my overall traffic.  Right now I have four blocked user agents: baiduspider, larbin, sogou, and sosospider, all of them from Chris's recommendations.  Then I immediately got a Slurp violation, though I am going to give Slurp one more failure before I block it.  Chris also has MSN blocked, but a lot of my traffic comes from Live Search, so I'm a little scared to do that.

I did run into one issue with the solution, though.  When I installed it, I had it set to automatically block violators.  Unknown to Chris and me, Google caches the robots.txt file!  Since Googlebot hadn't picked up my new robots.txt yet, it kept requesting pages the new file disallowed, and it got blocked!  So it is recommended not to turn on automatic blocking for at least a few days.
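
GateKeeper's own code isn't reproduced here, but the core technique, rejecting requests whose user agent matches a block list, can be sketched as an ASP.NET HttpModule.  This is a rough illustration under my own names, not Chris's actual implementation:

    using System;
    using System.Web;

    // Rough sketch of user-agent blocking; NOT GateKeeper's actual code.
    public class UserAgentBlockModule : IHttpModule
    {
        // The agents mentioned above; GateKeeper manages its own list.
        private static readonly string[] BlockedAgents =
            { "baiduspider", "larbin", "sogou", "sosospider" };

        public void Init(HttpApplication context)
        {
            context.BeginRequest += OnBeginRequest;
        }

        private static void OnBeginRequest(object sender, EventArgs e)
        {
            HttpApplication app = (HttpApplication)sender;
            string agent = app.Request.UserAgent;
            if (agent == null) return;

            agent = agent.ToLowerInvariant();
            foreach (string blocked in BlockedAgents)
            {
                if (agent.Contains(blocked))
                {
                    // Refuse the request and skip the rest of the pipeline.
                    app.Response.StatusCode = 403;
                    app.CompleteRequest();
                    return;
                }
            }
        }

        public void Dispose() { }
    }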

Related post from Chris’s site:
The Continued Struggle With Spiders
To catch a spider…
Abusive Web Crawlers
Blocking Bad UserAgents and IP Addresses
The elusive Robots.txt file


Is 800 x 600 finally dead?

November 8, 2008 15:50 by ckincincy

OK, I need your help.  I want to know what your stats say about your visitors.

I have been giving some consideration to redesigning my site and was wondering if the restriction of 800 x 600 screen resolution was still an issue. 

So I looked at the stats from a few sites I run, and I think the answer is yes, it finally is.

Sites two and three are church web sites, so their statistics skew toward an older crowd, and even there the 800 x 600 numbers are low.

Site 1
[screenshot: screen resolution stats for site 1]

Site 2
[screenshot: screen resolution stats for site 2]

Site 3
[screenshot: screen resolution stats for site 3]

So what do your stats show?


'.', hexadecimal value 0x00, is an invalid character. Line 1, position...

October 5, 2008 16:00 by ckincincy

Last week we upgraded our company from Sybase to SQL Server 2005.  All of a sudden one of our web applications started throwing this error:

'.', hexadecimal value 0x00, is an invalid character. Line 1, position 198128.

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Xml.XmlException: '.', hexadecimal value 0x00, is an invalid character. Line 1, position 198128.

This was happening when I called a .NET web service from an ASPX page; the web service call returned a serialized object.

For context, I have a Customer class used something like this:

    Model.Customer cust = new Model.Customer();
    cust.Id = 1;
    cust.Name = "CK IN CINCY";

Now for the code that gave me this error (i.e., the ASPX code-behind):

    Model.Customer cust;
    WebSvc.GetCustomer(1, out cust);

This would blow up.

The web service looked like this:

    [WebMethod]
    public int GetCustomer(int Id, out Model.Customer cust)
    {
        // Fill the cust object from the database.
        return 1;
    }

The problem was that the data in the database contained null characters.  This is different from null records: a null record is self-explanatory and can be dealt with in the SQL with an ISNULL call.  Here the record existed but somehow contained a null character, as in CK IN [NULL CHARACTER] CINCY, and ISNULL won't solve that.  After about five hours of searching I found this post, which contained the function below, and it did the trick for me.  Hopefully this saves you five hours of your time :-).

    /// <summary>
    /// Removes control characters (including nulls) and any character
    /// above 0x00FD, keeping tabs, carriage returns, and line feeds.
    /// </summary>
    /// <param name="inString">The string to process</param>
    /// <returns>The cleaned string</returns>
    public static string RemoveTroublesomeCharacters(string inString)
    {
        if (inString == null) return null;

        // StringBuilder lives in System.Text.
        StringBuilder newString = new StringBuilder();

        foreach (char ch in inString)
        {
            // Keep printable characters below 0x00FD plus tab, line feed, and
            // carriage return; drop everything else, including the 0x00
            // characters that broke the XML serialization.
            if ((ch < 0x00FD && ch > 0x001F) || ch == '\t' || ch == '\n' || ch == '\r')
            {
                newString.Append(ch);
            }
        }

        return newString.ToString();
    }
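
For illustration, the natural place to apply it in my case is the web service, scrubbing string fields before the object is serialized.  A sketch, not the exact code we used:

    // Inside GetCustomer, after filling cust from the database:
    cust.Name = RemoveTroublesomeCharacters(cust.Name);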

Telling a client no

March 15, 2008 16:43 by ckincincy

I've helped a local company with their web site over the years.  They have come to me several times with 'great ideas' for it, from 'nice' blinking text to hiding text at the bottom of the page to 'fool Google'.

When they come to me with these ideas, I fill them in on why each one is either a bad practice or a downright bad idea.

And it isn't just this client I've had these kinds of conversations with.  Most clients highly respect my technical skills and accept my answer; sometimes they listen to my reasons and still decide differently.  In the end, they are the boss.  I've only had one client that seems to constantly override my advice, which is frustrating for sure.

But this local company came to me with another 'great idea', and I again told them why it was bad.  So they decided to go with somebody else.  Honestly, I'm kind of relieved.  But I'm waiting for the day they face the consequences of not having somebody willing to tell them no: the day Google's spider notices the hidden text and drops them from the index.

So I guess the bottom line is: feel free to tell a client no, but realize they are the boss.  And if it is in your power, and their constant overruling frustrates you to no end, drop them as a client.  Letting them make you build a bad site or application isn't worth your reputation.


Develop for Standards

February 24, 2008 19:33 by ckincincy

We are on the verge of two more browser releases: IE 8 and Firefox 3.  I've tested FF 3 (it was very easy to install without disturbing my current configuration) and have read up on IE 8.  Both aim to be standards compliant: FF 3 currently passes the Acid2 test, and IE 8 is reported to pass it.

Now here is the issue: IE 8 will not use its standards mode by default.  You have to add a meta tag (the X-UA-Compatible tag) to your site saying it is compliant.  The thinking is that Microsoft didn't want the new browser to break the existing web.

So here is my call to all developers... break the stinking web.  When you develop a site, make it 100% compliant with the standards and let the users stuck on old software deal with it.  We need to stop writing hacks to accommodate the various versions of web browsers.

When I started out many years ago working in classic ASP, I used GoLive, which my coworkers and I had two bits of fun with:
1. We had a saying: be like Jesus, save often.
2. We called it GoDead due to its many crashes.

But GoLive was not standards compliant at all.  Now that I do my development in Visual Studio, I make sure all my sites are standards compliant, so they should render perfectly in the new browsers.

But for those of you who develop websites, I repeat my plea: BREAK THE WEB. 


CreativeMYK.com

January 13, 2008 17:06 by ckincincy

I do a fair amount of image creation for my church web site, so I was happy to learn about CreativeMYK.com, a repository of royalty-free images aimed at church audiences.  It is a new site, so it should have more to offer over time, both in images and in features (currently, searching is painful).

Credit to: Joel Young

Now as an addition I also use two other sites for my sources:

www.sxc.hu - A site with user-submitted images.  Licensing varies, but most images are free to use, and its search feature is pretty good.

www.creativecommons.org - A search site that ties several image sources together; you can check a box to limit your search to royalty-free images only.  A nice way to find good photos.


CSS Only drop down menu

January 8, 2008 21:36 by ckincincy

I've always wanted to find a pure CSS drop-down menu, but I never could find one that worked in the three major browsers (Firefox, IE 7, and IE 6).  Until now!  CSS play has a purely CSS menu that I've confirmed works in all three.

It is really slick and works without JavaScript.

You can see it in use on my church's web site.


Testing If-Modified-Since header on Windows

December 14, 2007 21:52 by ckincincy

As my previous post alluded to, my blog has had an issue with FeedBurner: their site is not updating my feed when my scheduled posts become visible.

Matt Shobe (yep, a co-founder of FeedBurner) told me that my handling of the If-Modified-Since header was inaccurate and a potential cause of the issue.  So I went searching for a way to test this header on my Windows XP box.  I couldn't find one, so I emailed Matt and asked if he could point me in the right direction.

He said I could use Linux (I don't have a copy installed) or possibly Cygwin.  And I can!  So I grabbed the latest version, looked through all the packages, and found two that include curl (in the Net category).  I let Cygwin install, then ran this line in the shell that opens when you launch Cygwin:

    curl -H 'If-Modified-Since: Wed, 12 Dec 2007 05:59:00 -800' -IL http://www.ocdprogrammer.com/myfeed.axd

(Don't go to that URL; it isn't valid.)

And sure enough, my BlogEngine install had a bug in it.  I've since fixed the bug and hope it resolves the issue.  We will find out in the morning, as my LifeBlog has a post scheduled for six a.m.

I am blown away by how helpful FeedBurner's support is.  Thanks, Matt!  Regardless of whether this bug turns out to be the cause of my FeedBurner issue, it was a bug, and the BlogEngine platform is better now that it is fixed.
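
For reference, the server-side pattern being tested looks roughly like this: a handler should answer 304 Not Modified when the feed hasn't changed since the date the client sent.  This is only a sketch of the general technique, not BlogEngine's actual fix; FeedHandler and GetFeedLastModified are names I made up:

    using System;
    using System.Web;

    // Sketch: honor If-Modified-Since by replying 304 when nothing changed.
    public class FeedHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            DateTime lastModified = GetFeedLastModified();

            string header = context.Request.Headers["If-Modified-Since"];
            DateTime since;
            if (header != null
                && DateTime.TryParse(header, out since)
                && lastModified <= since.ToUniversalTime())
            {
                context.Response.StatusCode = 304; // Not Modified
                return;
            }

            // Stamp the response so clients can send If-Modified-Since next time.
            context.Response.Cache.SetLastModified(lastModified.ToLocalTime());
            // ... write the feed XML to context.Response here ...
        }

        // Stand-in: real code would return the newest post's timestamp in UTC.
        private static DateTime GetFeedLastModified()
        {
            return DateTime.UtcNow;
        }

        public bool IsReusable { get { return true; } }
    }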