ilkka_nissila
That may be the case in corporations and governments. In my surroundings, computing is getting more and more diverse, with many more devices that can do powerful computing locally. Networks are great for searching for information and for messaging, but accessing centralized computing, apart from large-scale facilities such as Google's, is still a pain, and people try to avoid it.
If I compare today with the 1990s, most people at my university back then were using central computers and terminals in a Unix-based environment, as well as Windows-based desktops. They were all centrally managed (apart from home computers). Today basically every student uses a laptop to do their reports, calculations, programming etc., and a mobile for communication. At the labs, researchers have a bunch of computers on their desks (typically a laptop and a PC), and the actual experimental labs have additional computers which might not even be online (real-time processes don't like interruptions from virus-checking software etc.).
For actual computation, modeling, data analysis, and visualization, I think most people prefer to work locally because of control (as well as speed of data access). Heavy computing may be done on a cluster, but the thing with clusters is that you never know how much other load there is and when you will get your results. So some people will go to great lengths to write GPU code and optimized algorithms so they can calculate their stuff locally without relying on resources elsewhere. Real-time local data analysis is what many people in my field want and strive for. In big-data analysis, things might be different.
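To make that concrete, here is a minimal sketch of the kind of local GPU computation I mean, in Python with the CuPy library (assuming a CUDA-capable GPU; the data and the "filter" are just placeholders, not a real analysis):

import numpy as np
import cupy as cp  # drop-in NumPy-style arrays on a local NVIDIA GPU

# Stand-in for real lab data; in practice this would stream from an instrument.
data = np.random.randn(4096, 4096).astype(np.float32)

gpu_data = cp.asarray(data)               # copy onto the local GPU
spectrum = cp.fft.fft2(gpu_data)          # 2-D FFT computed on the GPU
spectrum[1024:, :] = 0                    # crude placeholder "filter"
filtered = cp.fft.ifft2(spectrum).real    # back to the spatial domain
result = cp.asnumpy(filtered)             # copy the result back to host memory

The point is that the whole pipeline runs on hardware you control: no queue, no other users' load, no network in the loop.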
In academia people move a lot, and they take their data and code with them so that they can work while at a conference or while working abroad. Accessing computational resources across the Atlantic is often slow and unreliable, and many universities don't even allow their centralized computers to be accessed off-site for security reasons. So it's just easier to put things on a laptop or a desktop computer and run them there where possible.
Nowadays you have computers in cameras, phones, tablets, TVs, and cars; even a washing machine may have a computer of sorts. My car has a radar monitored by an onboard computer: the radar data is analyzed in real time to detect an imminent collision, the driver is warned of the risk, and if the driver doesn't react, the car applies the brakes on its own. This cannot work by sending radar data to a central computer wirelessly and then waiting for that computer to analyze it, when the reaction has to happen in a split second. Networks are great for accessing databases and information and for communicating, but for interactive work that requires computation, or for real-time processing, they are too unreliable and slow.
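The argument is really about latency budgets. A toy sketch of the idea (all numbers and names hypothetical, nothing like a real braking system):

import time

REACTION_DEADLINE_S = 0.05  # hypothetical: the system must decide within 50 ms

def apply_brakes():
    print("braking")

def time_to_collision(distance_m, closing_speed_ms):
    # Simplest possible estimate; a real system fuses several sensors.
    if closing_speed_ms <= 0:
        return float("inf")
    return distance_m / closing_speed_ms

def on_radar_sample(distance_m, closing_speed_ms):
    start = time.monotonic()
    if time_to_collision(distance_m, closing_speed_ms) < 1.0:  # hypothetical threshold
        apply_brakes()
    elapsed = time.monotonic() - start
    # Local compute fits this budget deterministically; a wireless round trip
    # to a data center can exceed it on its own, with no guarantee either way.
    assert elapsed < REACTION_DEADLINE_S

on_radar_sample(distance_m=10.0, closing_speed_ms=15.0)  # ~0.67 s to impact -> brake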
I am sure that Lightroom CC (the cloud-based app) doesn't do the basic computations required for interactive image editing in the cloud; it does them on the mobile device itself. I would guess it applies the edits to a local preview, shows them on the mobile, and later updates the preview from the raw data once there is time to transfer the image. The cloud can do things like AI-based image search, but that makes it an example of both distributed and centralized computing; it's not either/or. If everyone really moved to transferring their high-resolution images across wireless networks (mobile broadband), I think it would create a huge increase in energy consumption and wouldn't be feasible in the end. How much coal do you want to burn to be able to access any data anywhere with a device that fits in a pocket? I don't think this development is healthy.
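Adobe's internals aren't public, so this is only my guess at the pattern, but the split I'm describing looks roughly like this (all names and the toy edit are made up for illustration):

from queue import Queue

upload_queue = Queue()  # raws sync to the cloud in the background

def apply_edit_locally(preview_pixels, exposure_ev):
    # The interactive part: runs instantly on the small on-device preview,
    # so a slider feels responsive even with no network connection at all.
    gain = 2.0 ** exposure_ev
    return [min(255, int(p * gain)) for p in preview_pixels]

def schedule_raw_sync(raw_path, edit_params):
    # The non-interactive part: move the big raw file (plus the edit recipe)
    # whenever bandwidth allows; cloud services like image search read it later.
    upload_queue.put((raw_path, edit_params))

preview = [10, 80, 200, 240]  # toy stand-in for preview pixel values
edited = apply_edit_locally(preview, exposure_ev=1.0)
schedule_raw_sync("IMG_0001.CR2", {"exposure_ev": 1.0})

The interactive part never waits on the network; only the bulk transfer does, and that is exactly the mix of local and centralized computing I mean.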