Summary
Wherein I recount my experience of using a thin client over a DSL line.
Director Morgan Spurlock got the idea for the movie Super Size Me while sitting on a sofa after a Thanksgiving dinner, stuffed, unable to move. The idea seemed simple enough: go on a McDonald's diet for 30 days straight, eating nothing but McDonald's food, and document each day what happens. He did just that, and while I'm no fan of the movie, you can see the effects for yourself.
Starting this week, I will go on a reverse Spurlock diet: For a week, I will use a thin client to do my very technical work of producing software and other content, instead of a PC or a laptop. While using thin clients over a high-bandwidth corporate network is not a novelty, I have only a plain DSL connection at my disposal. My desktop will reside on a remote server across the country, and I will be connected to that desktop via DSL. Because some software I work on cannot be moved to that remote desktop, I will still use my laptop during the week. But for all other work, I will use the thin client.
My motivation for experiencing how thin clients work is to find out if thin clients can help solve what I perceive as a crisis on user desktops. For the past three years, I've been running a company that provides software and services to small businesses. Most of our customers cannot afford a dedicated IT manager to look after their desktops, which are almost exclusively Windows-based. As a result, most customers experience the following desktop lifecycle:
Clean Slate. The customer purchases a new PC. When he receives it, he calls the local computer guy to set the PC up. He then calls us to walk him through installing our software, which he uses to run the business.
Gradual Decline. The way most computer guys set up a new PC is to give the user access to the administrative account. The reason is simple: Leaving the admin account as the default allows the user to install any software without having to call the computer guy. Since most users don't understand, or care about, the difference between administrative and less-privileged accounts, the concept of switching between accounts is lost on them. They also tend to forget account passwords, especially in environments with high employee turnover.
Crisis. Daily use of a network-connected computer with full administrative privileges leads to the crisis stage: Over time, viruses, spyware, and the effects of the business owner's children playing on the computer take hold, and the OS slows to a crawl, eventually preventing critical business functions from being performed at crucial moments, such as making a sale or providing a quote.
In the moment of crisis, the only contact the business owner can rely on is the software vendor whose product he uses to run the business. After all, that's the only piece of software that really needs to work, and in the crisis stage, that software also malfunctions. With luck, and after a lengthy and expensive remote crisis management session on the phone or on the Web, we can either restore the user's desktop to some semblance of normalcy, or we have to suggest that the user wipe the OS clean, reinstall, and start over. The circle is now complete, with the PC returned to a more or less clean state.
That vicious circle has led me to believe that software is now a service business, and that software vendors can provide greater value with service than with their software products, simply because desktop crisis management is the service users need the most. This presents an enormous challenge to ISVs delivering mission-critical software, and makes such software practically unavailable to small enterprises unable to afford huge consulting and support fees. But observing this cycle also convinced me that there must be a way to break out of it, and that crisis can be turned into an opportunity.
I got a glimpse of that better way when I visited an acquaintance who is a Sun exec. He had neither a PC nor a laptop in his house, and instead was conducting his entire business via a thin client, a SunRay, residing strategically on his kitchen counter. The most interesting aspect was not the SunRay itself, but the kind of performance he was able to obtain from it via a simple DSL line.
I contacted the Sun folks in charge of the SunRays, and was promptly provided with the device to try out, along with an account on Sun's prototype desktop grid facility. The following is a brief diary of the first few days of my living thin. I will add to it in the coming days.
Day 1.
I received the SunRay via the local Sun office in Southern California. The thing comes in a single box containing the SunRay itself, which is about the size of a largish book, a keyboard, and a mouse. This version of the SunRay, the 1G, does not include a display, and can instead work with any PC monitor via a 15-pin SVGA connector. Since I have an old Sun monitor lying around, which is essentially a Sony with the Sun logo on it, I will just connect the SunRay to that. The keyboard and mouse are USB-based.
The back of the SunRay has four USB ports, an Ethernet port, an SVGA video connector, inputs for audio devices, and an S-video connector. Of course, there is a power connector as well. The front of the unit has a smart card reader, a microphone input, and a headphone output. The SunRay 1G is lighter than a 12-inch Apple PowerBook.
Day 2.
I finally have time to set up the SunRay. The instructions in the box are minimal, but the Sun folks provided me with a few lines of instructions on how to connect to the Sun prototype grid facility. Apparently, they loaded network settings on the unit before shipping it to me, so this thing should just work right out of the box.
OK, so I connect the keyboard, mouse, and monitor, and run an Ethernet cable between my Linksys DSL router and the SunRay. Then I turn on the monitor and plug the power cable into the device (the 1G doesn't have a power switch).
The monitor immediately wakes up and displays an hourglass-like image, which I guess indicates that the unit is trying to connect to the network. At the top of this hourglass, I notice a DHCP-assigned IP address, which must have been handed to the SunRay by my router. The lower end of the hourglass displays an IP address that must be the address of the network the SunRay is trying to connect to.
About 15 seconds pass. Nothing happens. I start to wonder if the SunRay can make the connection to the remote desktop server via the Linksys router. Since the router performs network address translation, and also acts as a firewall, can the SunRay navigate its way through that NAT layer?
Another 5 seconds pass. Then I can finally see the hourglass on the screen change. After about another 5 seconds, the screen goes blank, then immediately switches to a Solaris login screen. Whoa, so I'm connected to the Sun network. (So much for my NAT worries: since the SunRay initiates the connection outbound, the router apparently treats it like any other client behind the firewall.)
I enter the login name and password the Sun folks had assigned to me, and click on the login button. So far, the mouse and keyboard input feels as though I were typing on a local machine, whereas, in fact, the login screen is running on a remote server.
In a few seconds, I get a desktop that looks very much like the GNOME desktop I use every day on my laptop, with some differences in the theme and in some of the buttons and icons. So this is the Java Desktop System. So far, not much Java, save for the neat Java logo desktop background.
I click on the Launch button on the lower taskbar, and the program options menu appears. That menu is organized exactly like the GNOME launch menu, which, in turn, is very similar to the Windows Start menu. While I can now sense that the desktop is running across the network, the screen redraws are very fast.
I fire up a Web browser, which happens to be Mozilla 1.4 on this Sun desktop. I had used Mozilla on my laptop until I could get Firefox working, and Mozilla was always very sluggish compared to Firefox. Now, however, Mozilla seems to start up fast, taking me to the Sun home page by default.
So far, so good: about six minutes from unpacking the device to the point of browsing the Web and being able to use the desktop.
> Why not use VNC? Seems cheaper and easier to set up. Additionally, it allows you to collaborate with others (though the thin client may do so as well).
Actually, we use VNC a lot for customer support. But with many people now having their Windows firewalls turned on, it's harder to get it working without a lot of fuss.
But the SunRay is totally different from VNC. VNC gives you a view of a remote desktop, but you still need a computer to run VNC. The SunRay is not a computer, but rather a device that lets you access a remote desktop environment.
Our problem has been that no matter what software we give to our customers, they always find a way to mess their environment up. Perhaps if they did not use a computer at all, but used a simple device instead, we'd all be better off.
The other nice thing about the SunRay (so far, in my experiment) is that it seems to have a really intelligent algorithm to determine what parts of the screen need redrawing, and it transmits only the required screen segments. This really works, because when you type, for instance, often the only area of the screen that changes is the characters you're typing. So typing on this device is really fast, almost comparable to a mid-range PC.
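I don't know what Sun's actual algorithm looks like, but the general idea of sending only changed screen regions can be sketched in a few lines of Python. Everything here, from the tile size to the flat framebuffer representation, is my own assumption for illustration:

    # Toy dirty-tile detection: compare two frames tile by tile and
    # yield only the tiles that changed, so only those cross the wire.
    # Assumes the frame is a flat row-major list of pixel values and
    # that width and height are multiples of TILE.
    TILE = 16  # tile edge in pixels

    def tile_pixels(frame, width, x, y):
        """One TILE x TILE block of a row-major framebuffer."""
        return [frame[(y + dy) * width + (x + dx)]
                for dy in range(TILE) for dx in range(TILE)]

    def dirty_tiles(prev, curr, width, height):
        """Yield (x, y, pixels) for each tile that differs from prev."""
        for y in range(0, height, TILE):
            for x in range(0, width, TILE):
                t = tile_pixels(curr, width, x, y)
                if t != tile_pixels(prev, width, x, y):
                    yield x, y, t

When all you've done is type a character, only the one or two tiles under the cursor differ, which is presumably why typing feels nearly local even over DSL.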
As it happens, I'd similar thoughts myself a while back on using thin clients: http://talideon.com/weblog/2005/04/dumb-terminals-term-svc.cfm

Mind you, actually going the whole way and using a thin client is a bit unrealistic for me, and I was thinking of it in slightly different terms, more like the old timeshare systems. I'm not wholly comfortable with computing-as-a-utility.
FYI - A Mac can be used in a diskless thin-client style using a server to store centrally-managed boot disk image(s) and users' home directories. See http://www.apple.com/server/macosx/features/netbootnetworkinstall.html

This provides the advantages of centrally-managed application suites, operating system images, and home directories, while allowing users to run all sorts of commercial software from Adobe, Microsoft, etc.
We are back to the mainframe (read server) environment we never should have left in the first place. There never was a need for all the separate CPUs, storage, operating systems, and duplicate software (except for the hardware and software companies pushing the junk). It took a long time to rediscover the wheel. People will look back on the 1980s and 1990s and ask, "What were they thinking?" It all could have been so much easier. You talk like the SunRay is something new. Intelligent terminals were available a long time ago. I at least give Sun credit for pushing the idea that the network is the computer. You are traveling a well-worn path. It is nothing new.
Maybe so, but you need to remember that the current prevalence of full desktop machines is a result of the rise of computing outside of big business, where the use of mainframes used to be prevalent. Desktop computing then spread into those environments because individual machines were cheaper, so if one died, it didn't cost much to get a new one. However, the cost of buying a new mainframe was, and still is, prohibitive.
I also disagree that this is the same as the old timeshare systems. It's not quite reinventing the wheel: it's more like the wheel was invented, then the cog, and now we're looking at a cross between the two that works better than either in many circumstances. The re-emergence of thin clients will come about precisely for the same reason that Google as a search engine works: it's backed by a bunch of networked, cheap, and interchangeable boxen that can be added to as the need for power increases and that can be left to die safely.
This isn't a return to the big metal of the past, but taking the lessons learned from the experiment with individual desktop machines, and combining them with the lessons learned from the old timeshare systems, and coming up with something new from the two.
Your points are well taken, although I don't agree with all of them. The term mainframe is loaded, so I will use the term server (application/database). Few people realize the security, reliability, and availability that these boxes provided 25 years ago, without the need for desktop support, which by some estimates runs up to $5K a year per desktop. You can capitalize hardware costs, but not support personnel.
My real point, though, is: can you imagine where we would be if even half of all the development effort had been put towards server-based computing? The best analogy I can make is between mass transportation and cars. It is not a question of one or the other, but most of the stuff that was and is being developed is junk and would have been much better done on "mainframes". A lot of the software development today is junk and won't survive either. I am amazed how many times people can be fooled by the "next great thing" (client/server, OO, WebLogic, etc.).
> As it happens, I'd similar thoughts myself a while back on using thin clients: http://talideon.com/weblog/2005/04/dumb-terminals-term-svc.cfm
>
> Mind you, actually going the whole way and using a thin client is a bit unrealistic for me, and I was thinking of it in slightly different terms, more like the old timeshare systems. I'm not wholly comfortable with computing-as-a-utility.
Yes, I'd be interested to find out if businesses use Windows remote desktop this way, to provide access to a remotely located farm of computers.
In a way, the SunRay I'm using now is like what you describe (I guess), except that it doesn't even have an OS. It has some firmware, to be sure, but it's not an OS. It also has code implementing the remote display protocol the SunRay uses.
> FYI - A Mac can be used in a diskless thin-client style using a server to store centrally-managed boot disk image(s) and users' home directories.
>
> See http://www.apple.com/server/macosx/features/netbootnetworkinstall.html
>
> This provides the advantages of centrally-managed application suites, operating system images, and home directories, while allowing users to run all sorts of commercial software from Adobe, Microsoft, etc.
This sounds very interesting.
I know that many BIOSes can boot an OS image via PXE (http://www.webopedia.com/TERM/P/PXE.html). Many large clusters are configured to boot nodes via PXE from a centrally managed image. That way, the cluster nodes can be loaded with the OS needed for a specific job, and a fresh OS image can be loaded after a job has run.
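For what it's worth, the server side of PXE is mostly a DHCP hint plus a TFTP server. A minimal ISC dhcpd stanza might look roughly like this; the addresses and boot file name here are made up for illustration:

    # dhcpd.conf fragment: point PXE clients at a TFTP boot server
    subnet 192.168.1.0 netmask 255.255.255.0 {
        range 192.168.1.100 192.168.1.200;
        next-server 192.168.1.10;    # TFTP server holding the boot files
        filename "pxelinux.0";       # bootloader the PXE ROM fetches first
    }

The client's PXE ROM gets its IP address and the boot file name from DHCP, pulls the bootloader over TFTP, and the bootloader then fetches the kernel and OS image.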
I don't know whether PXE would work over a relatively low-bandwidth network, such as DSL. Has anyone tried that? I would think booting over a slow network might take quite a while, since large parts of the OS would have to be downloaded.
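A quick back-of-envelope calculation supports that worry. Assuming, say, a 200 MB boot image and a 1.5 Mbit/s DSL downstream link (both numbers are just my guesses), the transfer alone would take roughly:

    # Rough boot-image transfer estimate; image size and link speed
    # are assumptions, and this ignores TFTP overhead and caching.
    image_mb = 200          # hypothetical OS image size, in megabytes
    link_mbps = 1.5         # typical DSL downstream, in megabits/second
    seconds = image_mb * 8 / link_mbps
    print(f"~{seconds / 60:.0f} minutes")  # prints "~18 minutes"

Eighteen-odd minutes per boot would make PXE over DSL painful, which is presumably why the SunRay ships only pixels rather than an OS image.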
> We are back to the mainframe (read server) environment we never should have left in the first place. There never was a need for all the separate CPUs, storage, operating systems, and duplicate software (except for the hardware and software companies pushing the junk). It took a long time to rediscover the wheel. People will look back on the 1980s and 1990s and ask, "What were they thinking?" It all could have been so much easier. You talk like the SunRay is something new. Intelligent terminals were available a long time ago. I at least give Sun credit for pushing the idea that the network is the computer. You are traveling a well-worn path. It is nothing new.
I think it's easy to forget how expensive high-bandwidth, relatively low-latency networking was just a few years ago. Most corporate LANs still used 10 Mb/s Ethernet as the standard 8-10 years ago, and high-performance switches cost a great deal. So the kinds of solutions that remote terminals and network computers offer weren't really viable, IMO, on economic grounds.
Now, there is an interesting thing about the economics of bandwidth, especially when you get to the edge of the network. For instance, there is a fairly large amount of bandwidth available via broadband residential/office networks, e.g., via cable or DSL providers. We're only now beginning to make use of parts of that bandwidth. On the backbone side, there is a large network infrastructure in place with huge amounts of fiber capacity that's not being used, even with all the VoIP offerings available now.
So, it's inevitable that network providers will want to find new uses for that bandwidth, something that delivers value to customers (i.e., something customers will be willing to pay for). Just today, the WSJ had an article about DSL providers vying to offer video and TV via DSL lines. I think that desktops over IP (or display over IP) might be another kind of service those companies can offer. That may appeal to a market that really doesn't want to fiddle with a desktop OS, but instead just wants a service that works all the time.
For instance, my grandfather, who is now 82, has had a really hard time with viruses (not biological, but computer), spyware, and all that stuff lately. He certainly would be a customer for a desktop-over-IP solution, since it would serve his needs of sending email, browsing the Web, writing short documents, etc., with minimal fuss.
> For instance, my grandfather, who is now 82, has had a really hard time with viruses (not biological, but computer), spyware, and all that stuff lately. He certainly would be a customer for a desktop-over-IP solution, since it would serve his needs of sending email, browsing the Web, writing short documents, etc., with minimal fuss.
Yes, but if more people go in this direction, the virus writers will turn their attention to attacking the servers. People would still tend to surf to the same websites, answer yes when asked if they want to install a Trojan horse onto their server-hosted desktop, and so on.
Isn't the difference that it is cheaper to administer these desktops in a central location, by professionals who understand how to prevent and get rid of viruses? That kind of professional service might be cheaper, but it isn't free, and I wonder how much it would cost to provide that kind of service to customers at home. I can see this making economic sense for small businesses, but I'm not sure about home users like your grandfather. What's hard to scale and drive down in price is the cost of that service.
> Isn't the difference that it is cheaper to administer these desktops in a central location, by professionals who understand how to prevent and get rid of viruses? That kind of professional service might be cheaper, but it isn't free, and I wonder how much it would cost to provide that kind of service to customers at home. I can see this making economic sense for small businesses, but I'm not sure about home users like your grandfather. What's hard to scale and drive down in price is the cost of that service.
I don't know about prices, but last I heard this could cost less than a cup of mocha a day.
A network service provider that already controls the bandwidth going to a residence (e.g., a phone or cable company) has a fixed cost for maintaining that connection, so offering such a service is really an additional revenue possibility for them.
On the server side, a "desktop" provider can pool resources in a smart way to create efficiencies, e.g., there is no resource utilization when someone is not using his desktop. Resource utilization and provisioning, in general, is a problem that "grid" or "utility" computing providers have to deal with.
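To make the pooling point concrete, here is a hypothetical sizing sketch; the subscriber count, the peak concurrency, and the per-server session capacity are all invented numbers:

    import math

    # If only a fraction of subscribers are active at once, a shared
    # server pool needs far fewer machines than one-PC-per-user.
    subscribers = 1000
    peak_concurrency = 0.30      # assume 30% logged in at the busiest hour
    sessions_per_server = 25     # assumed capacity of one desktop server

    servers = math.ceil(subscribers * peak_concurrency / sessions_per_server)
    print(f"{servers} servers for {subscribers} subscribers")  # 12 servers

Under those made-up numbers, a dozen servers cover a thousand households, which is the kind of arithmetic that makes the utility model attractive to a provider.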
> I don't know about prices, but last I heard this could cost less than a cup of mocha a day.
>
> A network service provider that already controls the bandwidth going to a residence (e.g., a phone or cable company) has a fixed cost for maintaining that connection, so offering such a service is really an additional revenue possibility for them.
>
> On the server side, a "desktop" provider can pool resources in a smart way to create efficiencies, e.g., there is no resource utilization when someone is not using his desktop. Resource utilization and provisioning, in general, is a problem that "grid" or "utility" computing providers have to deal with.
I can see how the cost of the network and hardware resources can be made affordable. My question is about the cost of human beings providing the service to clean off viruses. It is far cheaper to share sys admins with all my neighbors than for each household to have their own, certainly, but how cheap?
> We are back to the mainframe (read server) environment we never should have left in the first place. There never was a need for all the separate CPUs, storage, operating systems, and duplicate software (except for the hardware and software companies pushing the junk).
From a pure technology point of view this may well be correct; however, for most companies, the best 'technical' solution is not always the best 'commercial' solution.
The net result of centralising control of hardware and software, and access to it (not to mention who decides who can run what and what software can be bought, etc.), is, for many companies, a bureaucratic nightmare. You end up with a black-box department where no one outside can actually get anything done without reams of arcane forms to fill in, and frequently weeks of waiting to get the most rudimentary response.
Remember print rooms? I do. The same used to be true there. To get a print, forms had to be filled in, handed in, and negotiated. Hours (or days) later you'd get the print back. If it was wrong in some way, you went around the loop again. If you complained, you went to the back of the queue. Now we have photocopiers. You want a print? You get a print. It's that simple.
Decentralising hardware and software to the users was a massive leap forward. Sure, there are some increased costs, but they are dwarfed by the increased flexibility. In addition, the job of maintenance is returned to being a service to the rest of the company, not an obstacle to its workings.
> People will look back on the 1980s and 1990s and ask, "What were they thinking?"
They were thinking how wonderful it was to have the freedom to work the way they wanted, without having to kowtow to a bunch of back-office geeks who think they know better.
Whilst the idea of dumb terminals looks good on paper and may well work in special situations, I can guarantee that in no time there'll be a user who says, "We need software X to do our job better", and the response will, once again, be, "We know what you need better than you do, and we approve of software Y. Oh, and you've exceeded your file storage allotment and must delete some files." And the user will just go out and get a PC to use in addition to the terminal.
V.