Feb 05

Google and Latitude

Google is confusing me at the moment with its execution of Latitude and its multiple versions. If I type google.com into the address bar, it defaults to google.co.uk and gives me this for Latitude

iGoogle UK version

When I type in google.com/ig (or click on the link to m.google.com provided in the notification emails), it keeps me on the .com TLD and gives me this:

iGoogle.com Latitude

It’s obviously available…so why does it pretend it isn’t? And what’s with the changing URLs?

Sep 24

Advertising or Subscription

I just twittered that I was off to the IAB MIXX conference today and Tom Morris responded (only partly tongue-in-cheek) that he thought they’d been replaced by Ad Block. Which leads me to my question. If, hypothetically, we all had Ad Block so that no ads were served, how much would you pay for your services – starting with Google Search and Gmail, and including Yahoo and the thousands of other small services that are ad supported? As a starting point, I pay $25/year for Flickr, for additional services and no ads. I’d probably pay the same for Gmail – but is search worth more or less?

Jun 19

Localisation

On June 12, Flickr introduced localisation – that is, local language and country versions – of their site in 7 different flavours: French, German, Italian, Korean, Portuguese, Spanish and Traditional Chinese. In what appears to be a consequence of this extension, they also ran up against local laws (or interpretations of laws) that meant users in Germany, Hong Kong, Korea and Singapore were restricted to safe search only, resulting in a storm of comment (most vocally from Germany) accusing Flickr of censorship.

And now YouTube have announced their localisation, with an extension into 9 more domains. As more than half their users are outside the US, it’s probably about time.

Today at a Google press event in Paris, Chad Hurley and Steve Chen are announcing the launch of nine new domains in Brazil, France, Italy, Japan, the Netherlands, Poland, Spain, Ireland, and the UK.

All of the language has been translated (and the UK/IE versions are different to the US – they’ve corrected the spelling!) and all are on new URLs. They have not touched the countries/languages that caused so much trouble for Flickr, and there are some interesting gaps. There also seems to be less advertising (currently) on the local sites; however, local content distribution deals have been done in these markets to add to the ‘professional’ content on the site, which will be a revenue stream, and I’m sure advertising will catch up.

What I can’t seem to find out is whether uploading to one version makes your video available to the other versions. The home pages are localised as to content – they are the result of an editorial decision. The browsing and listing pages are also localised; they are different in different markets. Looking at the honour listings for some of the videos, it looks like the US version has been set to the global resource and each market has its own honour listings. Which seems to mean that whilst I can tell how a video is doing somewhere like the UK, I can’t easily pull out the US figures. Does this mean all the video ranking sites are going to be changing their results over the next few weeks?

Apr 16

Google Speaker Event – Luiz Barroso

Last week I attended the latest in the NYC Google Speaker event series: Luiz Barroso, Google Distinguished Engineer, talking about “Watts, Faults, and Other Fascinating Dirty Words Computer Architects Can No Longer Afford to Ignore”. Luiz was talking about things way beyond my skill or experience, but I still got some great insights into designing hardware and infrastructure. That’s mainly why I go to things like this – it’s new information that expands on what I know a little about. So here are some notes from the talk; there was a lot more to it than I noted.

  • Power and energy usage have not been very sexy when it comes to designing architecture, and that has caught up with people. They are now the centre of plenty of attention.
  • In the ’90s there were two big research areas: the MHz race and the DSM (Distributed Shared Memory) race. The first aimed to accelerate single-thread performance, the second to improve the efficiency of shared memory.
  • Moore’s law is fundamentally about transistors. The issue is becoming power; they are energy wasteful and temperature control is difficult. Power costs are increasing and look likely to become more expensive than the hardware. It may tend towards the mobile model, where you get an energy contract and the hardware thrown in for free.
  • They are focusing on reducing conversion losses and improving power conversion. On PCs the power supply consumes much of the energy, running at only 55-70% efficiency.
  • Multi-core processors help reduce energy use. You need to design software differently to take advantage of them, building efficient concurrent programs.
  • Google has been monitoring disk failure. Common wisdom is that the failure rate is <1% and that temperature is a big factor. So we looked at 100k+ drives over 5 yrs. Failure rates were ~8% after 2 years, way larger than manufacturers’ rates, and temperature did not appear to affect the rate. Trying to find a predictive algorithm has had little success; more than half the disk failures happened with no indicative errors, and the arrival of errors did not indicate time to failure. The models are good for predicting population-wide trends, ie how many failures you will have and how many replacement disks you need. And also for telling you that temperature does not matter that much.
  • Looking at power requirements, the average data centre costs $10-22 per watt used, whereas US average energy costs about $0.80 per watt-year. It costs more to build a data centre than to power it for 10 years. You have to optimise energy usage to be close to capacity, thinking about power provisioning, how many machines can be used, and the cost of unused watts.
  • Studying power usage, we found the data centre never hit peak capacity, even if a rack on its own could have. A PC uses about 60W at rest and 120W at full usage; a human uses around 60W at rest and 1200W at high usage. We are far more energy-proportional – machines have a factor of 2 between idle and peak, humans a factor of 20. To improve energy efficiency for data centres, we should focus on reducing idle power usage.
  • So by reducing the idle power, with no change in peak, you can get 40-50% savings. You can reduce the peak power requirements for the data centre as a whole by reducing the machines’ idle consumption.
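That last pair of bullets can be checked with back-of-the-envelope arithmetic. Here is a minimal Python sketch, assuming (my simplification, not something from the talk) that a server’s draw scales linearly between idle and peak with utilisation; the 60W/120W figures come from the notes above, while the 10,000-server fleet, 30% utilisation and reduced 10W idle figure are illustrative assumptions:

```python
def fleet_power_w(n_servers, utilisation, idle_w=60.0, peak_w=120.0):
    """Average fleet draw in watts, assuming per-server power scales
    linearly from idle_w (0% utilised) to peak_w (100% utilised)."""
    per_server = idle_w + (peak_w - idle_w) * utilisation
    return n_servers * per_server

# 10,000 servers at a typical 30% utilisation
baseline = fleet_power_w(10_000, 0.30)               # ~780 kW
# Cut idle draw to 10 W per server; peak power is unchanged
improved = fleet_power_w(10_000, 0.30, idle_w=10.0)  # ~430 kW

saving = 1 - improved / baseline
print(f"saving: {saving:.0%}")  # ~45%, inside the 40-50% range quoted
```

With idle power mostly eliminated and peak untouched, the fleet’s average draw drops by roughly 45% at this utilisation – which is the point of the talk: at low utilisation, idle consumption dominates the bill.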
Feb 02

Google Speaker Event – Adam Bosworth

The second event on Monday was the first of the 2007 Google Speaker events. This looks to be the start of a series of events that Google is planning in their new space in New York. And a good, large space it is, even if painted a depressing battleship grey. But the lovely spread of food and the free beer or wine made up for that 😉

The first speaker was Adam Bosworth, talking about Physics, Speed and Psychology. Although, as he explained it, the title was designed to try and get the organiser to cancel the talk! He initially thought that the target audience was going to be a small group of Googlers and had written it up that way, but did not change it when told it was to be given to a far larger (about 150?) external group.

Nothing new or groundbreaking here; Adam just talked through some of his history, trying to explain why things failed or succeeded. Overall, extremely enjoyable, especially about the history of Ajax, even if he felt he was moving occasionally into the area his girlfriend had described as POF territory, ie pontificating old fart.

In 96/97 Adam was working on a team that was developing DHTML, now known as AJAX. As far as they could determine, the reasons for moving this way were sound – it was strategically well thought out, as it looked like the internet was moving applications to a thin client, not the thick client beloved by Microsoft at the time (so the team was not exactly the most popular with their message), and people were going to need a high level of interactivity on the web, similar to the Windows apps of the day. So they went ahead and built an office package, with spreadsheets and presentations and word processors, but no one used it. Their assumptions were completely wrong.

The companies they were trying to sell this to did not like it (Adam paraphrased their reaction as ‘stop developing this stuff, go away, we hate you’). The rich functionality of the apps meant that they would need a high level of support, which very few of the customers had. In 1997, web apps were used occasionally, not day in, day out as office apps were. The barrier to learning meant that everything needed to be extremely simple to reduce the need for support; it needed to be intuitive so that if you only visited it a few times a month or less, you could still work it. The other barrier was speed; to build any of these apps in JS takes a lot of JS and a lot of bandwidth, and the connections, and chips, were too slow. The delay and unpredictability of the response meant it was never liked.

Unsurprisingly, anything over a 0.5-second reaction time produces frustration; faster is better, but it appears people can live with half a second. Variations in network speed also meant that the reactions of the apps were extremely unpredictable; the system did not work consistently, so it turned people off very quickly.

Ten years later, though, Ajax is well into its second life. The physics has got better – bandwidth is far, far faster and so are the chips. Carefully crafted applications are fast enough for people to use (although if you’ve tried the Yahoo TV guide you can see it’s easy to build something that is a step backwards in functionality). The other psychological change is that people are using web apps far more; the increase in use frequency means that the interfaces can be richer.

Next up for examination were PDAs, which have gone through a similar life cycle. The first iterations were complicated pen computing, requiring handwriting recognition, which was unpredictable and did not work very well. The second iteration carried on with the writing but asked the human to learn to write in a special language. Whilst still unpredictable and slow, it was an improvement. The third iteration, the Blackberry and Treo, just decided to go with the keyboard. You can work faster, it’s predictable, and there’s no translation delay between input and it appearing on the screen.

The slowness and lag is why mobile browsing has been slow on the uptake. SMS, the simple text message, took off by accident. It was built for testing, but its simplicity, speed and asynchronicity helped spread usage. Now physics has caught up with the vision of mobile computing, and browsing on a mobile device is practical and not too painful.

The final area examined was natural language, which has failed so many times. If you can ‘talk’ to a computer in your natural language, you expect it to behave like a human and demand a precision that is not there. But its resurrection came in part from Microsoft Help, which needed some way to help people navigate its vast repository of help documents. Search, with its fuzzy logic, is perfect for natural language; it works even when it doesn’t really, because it is a time saver – it filters out the noise.

So the key lessons Adam has learnt?

  • Think about people’s activities
  • Determine the frequency of use; the less it is used, the simpler it needs to be.
  • If it takes more than 2 seconds to perform a task, break it down into smaller chunks – don’t make people wait

A fun talk, full of anecdotes and some useful advice. Google are planning to do a fair few events from the sound of it, so plenty to look forward to.

Nov 25

Jury’s Hotel SEO

If you use Google to search for Jurys hotel, the first result is the main page for the hotel group. But the next few pages demonstrate the power of a Search Engine Optimisation strategy: 15 of the next 19 results are also Jurys, lying across a range of subdomains such as bristolhotels.jurysdoyles.com. But if you explore further into the individual city searches, ie using london hotels or bristol hotels, they are more hit and miss in their results. You get a similar result when you look at Yahoo or Live Search. They appear to have 3 or 4 different domains and a sub-domain structure, which takes time and effort to manage. This must be one of the more blatant uses of SEO; I wonder if the business objectives are being met?

Feb 23

Google Page Creator

This did not get off to a good start… having a default purple layout is enough to make me quit the screen very quickly.

GoogleHome.jpg

And when I went to change the colour scheme, it decided to throw a wobbly.

Googleerror.jpg

Second time round, it behaves… and so I produce this. Very bad, no thought whatsoever, but in 5 minutes I have a website that I could, if I had the time, put some real content on… although it appears only if the content were words and pictures. But for that, it could not be easier. I can add text and images and drag them around. I can switch styles and layouts with a click of a button. This is a simple way of getting someone their first website up and running. What it won’t do, of course, is let people learn how a website is put together… but I can see my family loving this.

Dec 20

Google Zeitgeist

Google releases its review of the year, looking at the trends in searches. In the top gainers of the year, there are 2 cultural references which stand out from the other web/media terms. The first is Leonardo da Vinci, I’m guessing driven by the Dan Brown book and the forthcoming film. The other is Green Day – of course, the increased interest in that band could not have been driven by the mashup album American Edit 😉

They’ve picked a number of different items to highlight… just look at the increase in searches for wiki.

phenom3.gif