I have just returned from the e-Agenda Summit with a mind full of interesting stuff. I will be posting over the next few days, but one striking observation was that despite e-Learning being the focus of the summit there was little or no support for me, the e-Learner! That is, no power points for my Mac (I need these as my battery is only holding 45 minutes at the moment), and no wireless Internet access, except for the 10 minutes when the conference next door (dentistry, I think) started a network up, but they promptly came round and told us to keep our hands off! Ho hum.
Anyway, for a first post, some feedback on a seminar called Next Generation Network Services. It sounds dry, but the first part was conceptually impressive and the second seemed to have immense potential for online learning.
The first speaker was Julian Lombardi, talking about PlanetLab, and he painted a picture of what comes next. The argument ran that the Internet has essentially the same architecture now as it had in 1984 (ARPANET), and that the demands now put upon it by billions of instances of use mean that something new is required. The way the Internet is currently structured has one big flaw: when you need a service, so do I and everyone else at the same time. This leads to failure, either in a catastrophic form where services don't work at all, or in something less dramatic where things just slowly grind along.
What is needed is intelligent routing, and this can be delivered through a set of servers that use the current Internet for transporting data while jointly holding multiple copies of that data, allowing the 'intelligent' network to make choices about where data is accessed and retrieved from. One such server sits at Princeton University and is part of the PlanetLab network of interconnected nodes.
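To make that concrete for myself (and with the caveat that I am no networking person), here is a rough sketch of the idea as I understood it: the same item is held on several nodes, and a client simply reads it from whichever copy currently looks quickest to reach. The node names and latencies are invented for illustration; this is my own toy version, not how PlanetLab actually works.

```python
import random

# Toy sketch of 'intelligent' retrieval: the same item is replicated on
# several nodes and the client reads from whichever replica currently
# looks cheapest to reach. Node names and latencies are invented.
REPLICAS = {
    "princeton": {"latency_ms": 40, "data": "my mailbox"},
    "cambridge": {"latency_ms": 120, "data": "my mailbox"},
    "berkeley": {"latency_ms": 85, "data": "my mailbox"},
}

def probe(node):
    """Pretend to measure the current round-trip time to a node."""
    return REPLICAS[node]["latency_ms"] + random.uniform(0, 30)  # add jitter

def fetch(nodes):
    """Read from the replica that answered the probe fastest."""
    best = min(nodes, key=probe)
    return best, REPLICAS[best]["data"]

node, value = fetch(list(REPLICAS))
print(f"retrieved {value!r} from {node}")
```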
The example used was email. Currently this sits on a server somewhere (your ISP's), and if that fails you have no email; but if it were simultaneously held in several places around the globe, it would be readily accessible whenever you needed it. Your email needs to be stored somewhere, but not at any particular place! So what about data integrity? This was explained using the analogy of time and clocks. As we move around day to day we come across many different timepieces, and although we probably don't trust any one of them 100%, we know that by a process of 'mediation' between them we have a robust and trustworthy system overall. This sounded complex to me, but was explained as possible because of the falling cost of powerful (Linux-based) servers and the shared nature of the project: to join, you have to commit a server and a connection. It reminded me of the SETI@home project that harnessed the power of individuals' computers.
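The 'mediation' idea, as I understood the clock analogy, amounts to something like majority agreement between copies: no single copy is trusted completely, but if most of them say the same thing you take that as the truth. Again a toy sketch of my own, with made-up values rather than the actual mechanism:

```python
from collections import Counter

# Toy sketch of 'mediation' between copies, in the spirit of the clock
# analogy: no single copy is trusted completely, but agreement among a
# majority is treated as the truth. Values are invented; one copy has drifted.
replica_values = {
    "node_a": "meeting at 14:00",
    "node_b": "meeting at 14:00",
    "node_c": "meeting at 15:00",  # a stale or corrupted copy
}

def mediate(values):
    """Return the value held by a majority of replicas, if one exists."""
    value, votes = Counter(values.values()).most_common(1)[0]
    if votes > len(values) / 2:
        return value
    raise RuntimeError("no majority -- the replicas disagree too much")

print(mediate(replica_values))  # -> 'meeting at 14:00'
```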
This is very much a non-technological take on all of this. I am always aware of the hype that surrounds these things, and no doubt other groups are also working on the second-generation Internet along these and other lines.
Hopefully the lunch was more appetising!