Sherlock Holmes in the Cloud

The launch of Apple’s iCloud gives us an opportunity to stop and reflect on cloud computing and its relationship to nanosynchronization.

At one level, the Cloud is just a return to the centralized computing (mainframes) of the 1970s. Airline reservation systems were, and still are today, some of the largest “cloud”-based systems: 100% central storage and computing coupled to thousands of dumb terminals. Today’s cloud is a hybrid, balancing intelligence between the center and the device.

One of the great advantages of cloud computing is that system administration is greatly simplified because, in principle, the end user is not involved at all. Contrast this with Apple’s recent iOS 5 update, where I spent about six hours updating my devices, so we are not there yet in terms of zero administration.

So how does this relate to nanosynchronization? Here, the advantages are much more important than the holy grail of zero administration. Conceptually, the cloud enables centralized processing and correlation between different sets of data. Now imagine that all data is timestamped and geotagged: new possibilities arise, including all kinds of “Big Brother” scenarios. But don’t let that scare you; the credit card companies already know a lot more about what you did last night than you might care to think about.

In the Big Brother scenario above, American Express can see that you bought a few beers at the Crazy Z bar, and that at the same time, so did Cheryl Scarecrow. What they could conclude from this, I leave to your imagination.

Now let’s move this way of thinking to the scientific domain. If all instrument data in the world were timestamped (with nanosecond precision, of course), then once that data was uploaded to the cloud, you could make all kinds of fascinating correlations between the data sets.

All instrument systems, as well as consumer devices such as smartphones, digital cameras, smart cars, and so on, would be logging timestamped data independently of each other, with no master plan in mind. Now suppose an explosion occurred: a variety of devices in the vicinity would log light, video, sound, vibration, electromagnetic, and other kinds of signals. If all this nanosynched data were uploaded to a cloud, a central program could compute correlations between the different signals, making source location and detailed analysis possible.
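
To make this concrete, here is a minimal sketch of the kind of correlation a central program could run, written in Python with NumPy. It is my illustration rather than anything from a real system: it estimates the time difference of arrival (TDOA) of the same bang at two nanosynched devices by cross-correlating their recordings. The 48 kHz sample rate, the pulse shapes, and the 2.5 ms offset are all invented for the example.

```python
# Sketch: cloud-side TDOA estimation between two nanosynched recordings.
# Everything here (signals, rates, offsets) is invented for illustration.
import numpy as np

def tdoa_seconds(sig_a, sig_b, sample_rate_hz):
    """How much later sig_b saw the event than sig_a, in seconds.

    Assumes both recordings start at the same absolute (nanosynched)
    timestamp and share one sample rate -- exactly the hooks that
    make after-the-fact correlation possible.
    """
    # Cross-correlate; the peak marks the best alignment of the two signals.
    corr = np.correlate(sig_a, sig_b, mode="full")
    # In "full" mode, zero lag sits at index len(sig_b) - 1.
    lag_samples = (len(sig_b) - 1) - np.argmax(corr)
    return lag_samples / sample_rate_hz

# Hypothetical event: the same pulse recorded by two devices,
# with device B hearing it 2.5 ms after device A.
rate = 48_000                               # 48 kHz audio
t = np.arange(0, 0.1, 1 / rate)
at_a = np.exp(-((t - 0.0200) ** 2) / 1e-6)  # pulse at t = 20.0 ms
at_b = np.exp(-((t - 0.0225) ** 2) / 1e-6)  # same pulse, 2.5 ms later

print(f"Estimated TDOA: {tdoa_seconds(at_a, at_b, rate) * 1e3:.2f} ms")
```

With three or more devices, the same TDOA estimates, combined with each device’s geotag, are enough to multilaterate the source position, which is the “source location” step above.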

I call this an “ad hoc” measurement system, because nobody planned it. But because all the data had the right hooks (i.e., precision timestamps and geotags), it suddenly becomes extremely valuable and can be configured as part of a distributed measurement system, after the fact!
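
What might those hooks look like in practice? Here is a rough sketch of a record format and an ad hoc query, again in Python and again entirely my own invention rather than any real schema. The point is only that a precision timestamp and a geotag are enough to join data from devices that never planned to cooperate.

```python
# Sketch of the "hooks": every sample carries a nanosecond timestamp
# and a geotag. Field names and the query are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sample:
    timestamp_ns: int     # nanoseconds since the Unix epoch (nanosynched)
    latitude_deg: float   # geotag
    longitude_deg: float
    sensor_type: str      # e.g. "audio", "vibration", "camera"
    value: bytes          # raw reading; encoding is the device's business

def near_in_time(samples, event_ns, window_ns=1_000_000):
    """Ad hoc query: every sample within +/- 1 ms of an event time,
    regardless of which device logged it or why."""
    return [s for s in samples if abs(s.timestamp_ns - event_ns) <= window_ns]
```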

Again, there is nothing new in this. Sherlock Holmes has been doing the same thing for years: gather evidence, then try to “time stamp” it to figure out what happened, when, and where.

Nanosync just makes his life a lot easier.
