Unibet Privacy Proxy for Firefox and Internet Explorer

A little over a year ago, I came up with a neat idea for how to bypass any potential blocking of Unibet’s websites.

  1. As of today, we’re running this in production, and there is an updated version of the Firefox add-on.
  2. The big news is for Internet Explorer users and the users of our Poker software: just run this script to activate the proxy settings and keep everything working (a rough sketch of what such a script does follows below).
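The actual script is not reproduced in this post. As a rough illustration of what “activating the proxy settings” could involve on Windows, here is a minimal Python sketch that flips the WinINET registry values Internet Explorer (and software built on the same stack) reads; the proxy address is a placeholder, not an actual Unibet endpoint.

    # Hypothetical sketch only: point Internet Explorer (WinINET) at a local
    # privacy proxy by toggling the registry values it reads.
    # The host/port below are placeholders, not a real Unibet endpoint.
    import winreg

    INET_SETTINGS = r"Software\Microsoft\Windows\CurrentVersion\Internet Settings"
    PROXY = "127.0.0.1:8118"  # assumed local proxy address


    def enable_proxy(proxy: str = PROXY) -> None:
        """Enable the proxy for IE and other WinINET-based applications."""
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, INET_SETTINGS, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "ProxyEnable", 0, winreg.REG_DWORD, 1)
            winreg.SetValueEx(key, "ProxyServer", 0, winreg.REG_SZ, proxy)


    def disable_proxy() -> None:
        """Restore direct connections."""
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, INET_SETTINGS, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "ProxyEnable", 0, winreg.REG_DWORD, 0)


    if __name__ == "__main__":
        enable_proxy()

Applications usually pick up the new settings the next time they start, so a browser restart may be needed for the change to take effect.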

I hope you will find this useful!

RTWaaS?

A giant hurdle to buying a system or solution as software is the need to buy hardware, then install, configure and manage it. You need to train people on the product’s operational aspects and retain that skill within the company.

(Free) Open Source Software (FOSS) is great for spreading a product and building adoption and support for it. You enable developers and architects to play around with the stuff! The real challenge for FOSS (and other software) products is to go beyond the happy and content developer and also provide a painless path for adopters to realize business value without a huge investment hurdle in terms of hardware, software, training or services.

I think the reason something like Google Analytics or Salesforce.com is successful is that it is extremely painless to start using. You can focus on the business problem rather than the IT stuff. Obviously this is nothing new, and the examples I gave have been around for years. Software as a Service is great.

Then you have all the talk about the real-time web and putting information quickly, as it happens – “real time” – on users’ desktops. This is what Twitter and Facebook are about, but the real-time web is also needed for e-commerce, gaming and a lot of other areas. There are even conferences about it, so it must be happening ;)

Lastly, the final piece of the puzzle is Service Level Agreements. In order to provide “real-time web” messaging as a service, there is a clear advantage to being close to the information consumers, both in terms of scaling out and in terms of guaranteed latency. I think it is going to be hard to commit to meaningful SLAs without being at the edge.

If you remove the need to invest in infrastructure and the need to train people on the operational aspects, and then get excellent scalability and low latency guaranteed by contract, I’d buy it in a second. Who will provide me with the Real Time Web as a Service?

Web pages are disappearing?

I believe the page (URL) is becoming more of a task-oriented landing area where the web site adapts the content to the requesting user’s needs. I believe the divorce between content and pages is inevitable. It will be interesting to see how this affects the KPIs and analytics tools we currently use, as well as search engine optimization practices, going forward.

I recently attended a breakfast round-table discussion hosted by Imad Mouline. Imad is the Chief Technology Officer of Gomez. For those who aren’t familiar with Gomez, they specialize in web performance monitoring. It was an interesting discussion with participants from a few different industries, all either CTOs or direct reports of a CTO.

Imad shared a few additional trends regarding web pages (aggregated from the Gomez data warehouse):

  • Page weight is increasing (kB/page)
  • The number of page objects is plateauing
  • The number of origin domains per page is increasing

We covered a few different topics, but the most interesting discussion (to me) was related to how web pages are constructed on modern web sites and what impact this has on measuring service-level key performance indicators (KPIs).

In order to sell effectively you need to create a web site that really stands out, and one of the more effective ways of doing this is to use what we know about the user to shape that experience.

In general we tend to know a few things about each site visitor:

  • What browsing device the user is using (User-Agent HTTP header)
  • Where the user is (geo-IP lookup)
  • What the user’s preferred language is (browser setting or region)
  • Whether the user is a returning customer or not (cookie)
  • The identity of the customer (cookie), and hence possibly age, gender, address etc. :)
  • What time of day it is

So we basically know the how, who, when, where and what. In addition to this, we can use data from previous visits to our site, such as click-stream analysis, order history or segmentation from data warehouse analysis, fed back into the content delivery system to improve the customer experience.
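As a concrete (and purely illustrative) example of the request-derived signals listed above, a request handler might assemble something like the following context object. The header names are standard HTTP, but the cookie names and the geo-lookup helper are assumptions made for this sketch, not code from our actual site.

    # Illustrative only: collect the "how, who, when and where" signals from an
    # incoming HTTP request. Cookie names and geo_lookup() are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional


    @dataclass
    class VisitorContext:
        device: str                 # how: derived from the User-Agent header
        country: str                # where: from a geo-IP lookup
        language: str               # preference: from Accept-Language
        returning: bool             # returning visitor: presence of a cookie
        customer_id: Optional[str]  # who: identity cookie, if known
        hour_of_day: int            # when


    def geo_lookup(ip: str) -> str:
        """Placeholder for a real geo-IP database lookup."""
        return "SE"


    def build_context(headers: dict, cookies: dict, client_ip: str) -> VisitorContext:
        ua = headers.get("User-Agent", "")
        return VisitorContext(
            device="mobile" if "Mobile" in ua else "desktop",
            country=geo_lookup(client_ip),
            language=headers.get("Accept-Language", "en").split(",")[0],
            returning="visited_before" in cookies,
            customer_id=cookies.get("customer_id"),
            hour_of_day=datetime.now().hour,
        )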

For example, when a user visits our commerce site we can use all of the above to present the most relevant offers to that user in a very targeted manner. We can also cross-sell efficiently and offer bonuses if we think there is a risk of this being a lapsing customer. We can adapt to the user’s device and create a different experience depending on whether the user is visiting in the afternoon or late at night.
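To make this concrete, the content delivery side can be thought of as a set of rules (or a model) mapping those signals and data-warehouse segments to offers. The segment names and offers below are invented for illustration and are not our actual business logic.

    # Toy personalization rules; segment names and offers are invented.
    def pick_offers(device: str, hour_of_day: int, segments: set,
                    known_customer: bool) -> list:
        offers = []
        if "lapsing" in segments:
            offers.append("reactivation-bonus")        # retention offer
        if known_customer and "poker-player" in segments:
            offers.append("poker-tournament-ticket")   # cross-sell
        if device == "mobile":
            offers.append("mobile-app-promo")          # device adaptation
        if hour_of_day >= 22 or hour_of_day < 6:
            offers.append("late-night-freeroll")       # time-of-day adaptation
        return offers or ["default-welcome-offer"]

In this toy example, a known mobile customer flagged as lapsing who visits late at night would see the reactivation bonus, the mobile promotion and the late-night offer rather than the default landing content.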

If we do a good job with our one-to-one sales experience, the components and content delivered on a particular page (URL) will, in other words, vary depending on who is requesting it, where the user is requesting it from, what device is used, and what time it is. Depending on the application and the level of personalization, this will obviously impact both the non-functional and the functional KPIs: What is the conversion rate for the page? What is the response time for the page?