Aug 01

HOWTO: Using PayPal buttons with ASP.net the easy way

A couple of years ago I wanted to add a donation button to an ASP.net website. Specifically, I wanted a PayPal-based donation button. It should have been really easy. PayPal will let you make buttons on their site and they'll generate the code for you to put onto your own web page. But unfortunately… it's just not really that easy. I recently needed to do this again (just today, actually) and couldn't bring myself to use one of the old hacks I had done, and I finally found a super easy solution. I don't understand why it was so hard to find this information, but I decided to write it up for all to see and use.

The problem is that the code PayPal generates is HTML form code. The "action" of the form is a page on their website. The reason this is a problem is that most people use master pages in their ASP.net sites, and those master pages typically already contain a form tag. Since a web page is only supposed to have one form, dropping the PayPal-generated form into the content area of a page that is already wrapped in the master page's form is just not going to work. What happens is that when the user clicks the PayPal submit button, the page simply posts back to the local website instead of posting to PayPal. Very annoying.

I read about 20 web pages / tutorials today about how one can use PayPal-generated buttons within an ASP.net page, and I was about to give up and do it the old way when it occurred to me that maybe I could control the current page's masterpage's form's action programmatically. (By the way, that was the most possessives in a row ever!)

So here’s the deal, and it is so easy. Let’s say that the PayPal code generated for you looks like:

<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
 <input type="hidden" name="cmd" value="_s-xclick">
 <input type="hidden" name="hosted_button_id" value="blablablabla">
 <table>
 <tr><td><input type="hidden" name="on0" value="Subscription Level">Subscription Level</td></tr><tr><td><select name="os0">
 <option value="Pro">Pro $5.00 USD</option>
 <option value="Allstar">Allstar $15.00 USD</option>
 <option value="Superstar">Superstar $25.00 USD</option>
 </select> </td></tr>
 </table>
 <input type="hidden" name="currency_code" value="USD">
 <input type="image" src="https://www.paypalobjects.com/en_US/i/btn/btn_paynowCC_LG.gif" border="0" name="submit" alt="PayPal - The safer, easier way to pay online!">
 <img alt="" border="0" src="https://www.paypalobjects.com/en_US/i/scr/pixel.gif" width="1" height="1">
 </form>

Go ahead and paste the generated code into your page wherever you want it to go. You don't have to jump through any hoops here. Now go to your code-behind and add a couple of lines to the Page_Load method for the page in question. Here's a super simple Page_Load with the two new lines:

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // nothing PayPal-specific needed on first load
    }
    else
    {
        // ...or on postback
    }

    // The two new lines: point the master page's form at PayPal.
    Form.Action = "https://www.paypal.com/cgi-bin/webscr";
    Form.Method = "post";
}

That's it! We just programmatically set our master page's form's action and method to match the PayPal form's action and method. So now, when you click the PayPal button, your master page's form is still the one that handles it, but it handles it exactly the way the PayPal form would have. Enjoy!
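If you end up needing this on more than one page, the same two lines could be wrapped in a tiny helper. This is just a sketch of the idea; the class and method names below are mine, not anything PayPal or ASP.net provides:

using System.Web.UI;

// Hypothetical helper: the same two lines as above, made reusable.
public static class PayPalFormHelper
{
    // Point the page's (master page's) form at PayPal for the current request.
    public static void RedirectFormToPayPal(Page page)
    {
        page.Form.Action = "https://www.paypal.com/cgi-bin/webscr";
        page.Form.Method = "post";
    }
}

With that in place, the Page_Load above shrinks to a single call: PayPalFormHelper.RedirectFormToPayPal(this);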

Please let me know how it works for you!

Jan 14

Website color charts

Anyone creating websites has come across the need for color charts. Whether you're picking a background, a font color, a border color, or any other colored control or feature, it is crucial to have some sort of reference to work from. I'm not that artsy, so I don't have the color codes memorized from heavy use, but I also hate going and looking for decent ones.

As I find decent charts, I’ll add them here.

The main reason I like the first one listed is that it not only has the codes, it also shows a picture of each of the CSS standard colors. I am often coding away, trying to pick a color, and the intellisense only has the color NAME… not a sample next to it. Now I can whip out my handy dandy color chart via the link below, find the color I want visually, and then use the name specified next to it. Life is good.

Color Chart: http://www.neopets.com/~triflot

Dec 26

Link checker – Bad neighborhood

I often get requests for me to add links to my sites. Usually it is just someone looking for something simple that will deliver them some relevant traffic.
What I have found, though, is that one should ALWAYS verify that the link destination is okay. It should not be in a bad neighborhood. In addition, it should not link out to bad neighborhoods. These bad neighborhoods will get sites that link to them penalized in the search engines. That's right: the sites that you link to can get your site penalized. Not only that, but the sites THEY link to might get your site penalized.
The link below has a bad-neighborhood checker. It will scan a URL and determine if there are questionable links. Then it will scan the linked-to pages to see if any of their links are questionable in nature. It's a great little tool and I highly recommend using it.

Dec 26

Don’t try to beat the search engine

I just read the following article while trying to determine whether statically named pages are better for SEO than those with parameters in the URL. I'm always impressed with the things that come back when I google something. They are often not entirely relevant to what I was looking for, but can be very interesting anyway.

If you get a chance and you are interested in SEO at all, you might give the following a read:

http://www.stonetemple.com/articles/top-10-bad-SEO-ideas.shtml

From the article:

===============

So what’s the bottom line? There are really two major things you need to do:

* Learn how to communicate to the search engine what your site is about. Many of the problems listed above relate to common practices that make the search engine’s job harder, or even impossible. Learning how to build your site so that the search engine can easily determine the unique value of your site is an outstanding idea.

* Don’t spend your time figuring out how to beat the search engine. It’s just not a good place to be. You may even succeed in the short term. But if you do succeed in tricking them in the short term, the day will come when you wake up in the morning and a significant piece of your business has disappeared overnight. Not a good feeling at all.

Take the same energy you would have invested in the tricks and invest it in great content for your site, and in the type of marketing programs you would have implemented if the search engines did not exist.

This is how you can grow your business for the long term.

Feb 22

Give them what they want, not what they ask for

I was reading this article on slashdot the other day and it occurred to me how often I made a particular mistake when I first started programming. I created a perfectly sound little app for someone, and then when they complained I modified it to do exactly what they asked for. But that is often the wrong reaction. What we should really do is determine what the real problem is and how to address it. Sometimes the user who is complaining has enough power to force you against your will even if you know the "fix" they want is a bad idea, but often we have enough autonomy to come up with a happy medium.

The example from the article was in a game environment. Some users would get stuck. They wanted a way to get a hint so they could move on to the next level. The fear of the game writer was that if he offered a hint the users would “give up” too quickly and just take hint after hint. The users asked for hints, but he thought it was a bad idea. The solution? Limited hints OR hints that take away points from your score OR hints only after a certain amount of time elapses. In any case it was possible to give the user what they wanted without really giving them what they asked for.

Feb 15

PageRank – What it takes to get to the levels you want

It seems to me that everyone out there who intends to monetize their website wants to know how to get their page rank to a certain level. I can tell you now I have no idea what the answer is to these questions. My best advice is to create useful content and market yourself. Hopefully people will find you interesting enough to link to and that will push up your rank.

Now, because I know that even for me that answer is not "good enough" and I still want to know "how many links do I need to be a PR X," I have collected the following posts from other boards on the subject. Each post is preceded by the board in which it was found.

Source of the following two: http://www.webmasterworld.com/forum30/35069.htm

Hi all,

Regarding the “how many links” questions, I’ve found a page at seogeeks.com with the following claims:

To get the PR you want, you need about 18 links from pages with the same PR, assuming 50 links per page.

So if you want a PR7, you’ll need links from 18 PR7 pages, assuming each of these pages has about 50 outbound links. Alternatively, you could have 3 links from PR8 pages.

Well, that’s just what they say. I don’t know how current that information is. And can’t remember the exact url 🙂

and…

As a rough guide (assuming there is an exponential factor of 10 between PR levels) you can say that PR levels need a certain amount of PR 'points' passed to them from other pages:

PR1 needs 1 ‘points’
PR2 needs 10 ‘points’
PR3 needs 100 ‘points’
PR4 needs 1000 ‘points’
PR5 needs 10000 ‘points’

(The exponential nature of PR explains why getting to PR 9 and 10 is so tricky.)

PR 'points' given out by a page (again, this is a simple example) can be roughly calculated by dividing the above values by the number of outbound links.

e.g. a PR3 page with 20 links gives 5 'points' to each page it links to; this would move a page with no other inbound links to a PR1.

Assuming that the pages that you get links from have an average of 20 links you would need 20,000 links from PR2 sites to get to a PR5. Likewise 2,000 links from PR3 sites would be required.

The good news is that you would only need 2 such links from PR6 sites to make it to PR5.

There are many more factors to consider, such as damping, and the actual PR of a page can be anywhere from ?.0 to ?.99, which makes links worth different amounts even when they appear to have the same PR value, etc.

I hope that helps

I like both of those answers. And no, I didn't do the math to see if they are in agreement. But at a glance it is easy to see that the higher the rank of a page that links to you, the more valuable that link will be. Mathematically this is not exactly true (or, more precisely, it is not ALWAYS true, but it USUALLY is). Try to get links from others with high ranks. Or get gazillions of links from lower-ranked sites.
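If you feel like checking the quoted arithmetic yourself, here is a minimal sketch of that simplified 'points' model. It only encodes the rule of thumb quoted above (a factor of 10 between levels, points divided by outbound links); it is not how Google actually computes PageRank:

using System;

class PageRankPointsSketch
{
    // Points a PR level is assumed to need under the quoted model (PR1 = 1, PR2 = 10, ...).
    static double PointsNeeded(int pr)
    {
        return Math.Pow(10, pr - 1);
    }

    // Points a single linking page passes along: its points divided by its outbound links.
    static double PointsPassed(int linkingPr, int outboundLinks)
    {
        return PointsNeeded(linkingPr) / outboundLinks;
    }

    static void Main()
    {
        // The quote's example: a PR3 page with 20 outbound links passes 5 points.
        Console.WriteLine(PointsPassed(3, 20));                   // 5

        // Links needed to reach PR5 from PR2 pages that each have 20 outbound links.
        Console.WriteLine(PointsNeeded(5) / PointsPassed(2, 20)); // 20000

        // ...and from PR6 pages with 20 outbound links each.
        Console.WriteLine(PointsNeeded(5) / PointsPassed(6, 20)); // 2
    }
}

(The two rules of thumb quoted above assume different numbers of outbound links per page, so don't expect them to line up exactly.)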

Here’s a final comment from the same site:

Questions about "how many links" are almost impossible to answer because of the way Page Rank calculation works; the math behind it is far from linear.

Here’s a thread that took the Page Rank equation apart to a degree – perhaps it will help you understand why there is no easy answer for your question.

http://www.webmasterworld.com/forum30/34079.htm

Oh yeah, and supposedly PageRank only updates once a quarter or so. Bah.

Nov 19

Fix AJAX Error – PageRequestManagerParserErrorException: The message received from the server could not be parsed

===== Update: Clarified need for global.asax file and contents =====

A fantasy sports (fantasy basketball / fantasy football) website I do some work for had some AJAX-related issues a while back. I recently came across the same problem somewhere else and figured it might make sense to write it up.

The error messages were not consistent. A page might test fine, then break once it goes to the production server; or it might work when accessed from one location but fail for a user somewhere else. If you have these sorts of symptoms, I'm here to tell you it might not be AJAX's fault. Instead, we may need to blame the firewall. Typically the AJAX error will say something like:

“Sys.WebForms.PageRequestManagerParserErrorException: The message received from the server could not be parsed. Common causes for this error are when the response is modified by calls to Response.Write(), response filters, HttpModules, or server trace is enabled. Details: Error parsing near ‘ … [some more here] …

Some firewalls do not recognize the AJAX header ("X-MicrosoftAjax") that the browser sends along with an async postback. The firewall strips the unknown header out of the request, so the server no longer realizes it is dealing with an AJAX request and sends back a full page instead of a partial update. The user's browser then cannot make sense of the response and gives the error mentioned above. We actually had users who were able to convince their network admin to (temporarily) disable an option on their firewall that is typically called something like "REMOVE UNKNOWN HEADERS". When the option was disabled, the site functioned normally for them.

Unfortunately, disabling the user's firewall is not a viable solution. One recommendation might be to have the admin "tell" the firewall about the header so it will recognize it and quit removing it. Depending on the type of firewall (a product called Watchguard was the offender in our test case), there may be a way to whitelist specific headers (the AJAX header) rather than disabling the rule for all of them. There is nothing for us to "fix", as it is more a flaw with the firewall than anything else; if you are encountering this problem you will need to work with your network admin.

All of that being said, we found that it can also often be fixed by code on the website end! That is a much better option, yes? So here’s all you have to do.

<asp:contentplaceholder id="ContentPlaceHolderCodeForAjaxStrippingFirewalls" runat="server">
<script type="text/javascript">
// Copy the X-MicrosoftAjax header into the form body so the server can
// recover it even if a firewall strips the header from the request.
function beginRequest(sender, args)
{
    var r = args.get_request();
    if (r.get_headers()["X-MicrosoftAjax"])
    {
        var b = r.get_body();
        var a = "__MicrosoftAjax=" + encodeURIComponent(r.get_headers()["X-MicrosoftAjax"]);
        if (b != null && b.length > 0)
        {
            b += "&";
        }
        else
        {
            b = "";
        }
        r.set_body(b + a);
    }
}
// Hook the handler up so it runs before every async postback.
Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(beginRequest);
</script>
</asp:contentplaceholder>

Note: I put this inside a contentplaceholder (I am using master pages) so I didn't have to include it manually on every page. If you are not using master pages then you could just put the script on each page that uses AJAX, or register it from code-behind as sketched below.
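Here is a sketch of that code-behind option, in case pasting the script everywhere sounds unpleasant. The base-page class name and the script key are mine, but ScriptManager.RegisterStartupScript and ScriptManager.GetCurrent are the standard ASP.net AJAX calls:

using System;
using System.Web.UI;

// Hypothetical base page: any AJAX page that inherits from this gets the
// firewall workaround script without it being pasted into the markup.
public class AjaxWorkaroundPage : Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        // Only register on a full render; during async postbacks the handler is already wired up.
        ScriptManager sm = ScriptManager.GetCurrent(this);
        if (sm != null && !sm.IsInAsyncPostBack)
        {
            const string script = @"
function beginRequest(sender, args) {
    var r = args.get_request();
    if (r.get_headers()['X-MicrosoftAjax']) {
        var b = r.get_body();
        var a = '__MicrosoftAjax=' + encodeURIComponent(r.get_headers()['X-MicrosoftAjax']);
        r.set_body((b != null && b.length > 0 ? b + '&' : '') + a);
    }
}
Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(beginRequest);";

            // The final 'true' wraps the string in <script> tags for us.
            ScriptManager.RegisterStartupScript(this, GetType(), "AjaxFirewallWorkaround", script, true);
        }
    }
}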

Now we need the server side to put that header back when it shows up as a form field instead. I do that in my global.asax file, inside Application_BeginRequest:

void Application_BeginRequest(object sender, EventArgs e)
{
    // First event in the pipeline.
    // I'm going to use this to try to intercept AJAX headers when they get stripped by firewalls.
    // I got the information about how to do this from the url below:
    // http://forums.asp.net/p/1144748/1850717.aspx
    HttpRequest request = HttpContext.Current.Request;
    if (request.Headers["X-MicrosoftAjax"] == null && request.Form["__MicrosoftAjax"] != null)
    {
        // The header is missing but the script above tucked its value into the form,
        // so use reflection to push it back into the (normally read-only) header collection.
        ArrayList list = new ArrayList();
        list.Add(request.Form["__MicrosoftAjax"]);
        Type t = request.Headers.GetType();
        t.InvokeMember("MakeReadWrite", System.Reflection.BindingFlags.InvokeMethod | System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance, null, request.Headers, null);
        t.InvokeMember("InvalidateCachedArrays", System.Reflection.BindingFlags.InvokeMethod | System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance, null, request.Headers, null);
        t.InvokeMember("BaseSet", System.Reflection.BindingFlags.InvokeMethod | System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance, null, request.Headers, new object[] { "X-MicrosoftAjax", list });
        t.InvokeMember("MakeReadOnly", System.Reflection.BindingFlags.InvokeMethod | System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance, null, request.Headers, null);
    }
}

 

Do those two things and the problem goes bye-bye, and the network admins get to keep their silly little rules in place.

Note 2: I found the hint about what might be causing the problem here. I'm not entirely sure where I found the fix… it was a long day, thank you… but it works.

Nov 07

Browser Statistics (browser type and display resolutions)

When designing a website or web tool, you should design it around how your users' browsers are probably set up. For example, if you design the site to be 950px wide and a user's display is set to 800×600, your site will not fit on their screen. That is bad.

I know what I like my settings to be, and I generally just assume everyone else has theirs set the same as mine. That is not a good practice, so I'll recommend a better one: use the following information so you can at least like your odds.

General browser stats:

http://www.w3schools.com/browsers/browsers_stats.asp

http://www.w3schools.com/browsers/browsers_display.asp

The first one tells you the types of browsers in use. The second one tells you the display settings in use.

Nov 07

Learn CSS

I do not teach CSS. I link to CSS tutorials. Hopefully you can learn something there.

Okay, fine. I’ll teach one thing:
In CSS, specifically in style.css, the pound sign (#) is how you address a DIV with an id. The period (.) is how you address a DIV with a class. For a class example, if your markup is

<div class="wrapper"> ... </div>

then use .wrapper instead of #wrapper to address the wrapper DIV (e.g. .wrapper { width: 950px; } rather than #wrapper { width: 950px; }).