March 20, 2014

Get Ready For A Digitized Future


Ten years from now, computers will be at home in our pockets and inside our televisions, coffee makers, and almost everywhere else as well. They will pack vastly more computing muscle into tinier, lighter packages.
If the most optimistic computer scientists are correct, tomorrow's shirt-pocket computer could hold a billion bytes (equivalent to 2,000 books) in its memory and run at 50 million times the speed of today's fastest personal computers. We have no idea what to do with all that computing power, and we doubt that anyone else knows, either. Even the lowest estimates of the advances coming in the next decade will give tomorrow's computers a level of utility and convenience we can only imagine today.
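A quick back-of-the-envelope check of that memory figure (our sketch, not part of the original essay) shows it is at least internally plausible:

```python
# Does a billion bytes really hold about 2,000 books?
total_bytes = 1_000_000_000           # one billion bytes
books = 2_000
bytes_per_book = total_bytes // books
print(f"{bytes_per_book:,} bytes per book")   # 500,000 bytes, roughly 500 KB
# A 250-page novel at ~2,000 characters per page is ~500,000 characters,
# so 2,000 plain-text books per billion bytes is a reasonable estimate.
```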
To put it succinctly, the future of computing is "Convergence and Digital Everything".
Convergence is the grand coming together of technologies that used to be separate, such as computers and telephones. Already, the cheapest way to make a long-distance telephone call is to fire up your computer and log on to the Internet. Several companies provide telephone-style headsets for personal computers, together with the software needed to operate them. At this early stage of development, the sound quality is not yet up to that of a normal telephone, but the price cannot be beat: for about half a cent per minute, you can use this system to converse over the Internet with anyone in the world: London, Paris, Moscow, New Delhi.
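To make the idea of a digitized phone call concrete, here is a minimal sketch of what such software does: sample the sound, chop the bytes into small packets, and send them across the network. This is our illustration, not the essay's; a synthesized tone stands in for microphone input, the destination address is hypothetical, and real Internet-telephony software adds compression, jitter buffering, and echo cancellation.

```python
import math
import socket
import struct

SAMPLE_RATE = 8000          # telephone-quality sampling: 8,000 samples/second
PACKET_SAMPLES = 160        # 20 milliseconds of audio per packet
DEST = ("127.0.0.1", 5004)  # hypothetical receiver address

def tone(n_samples, freq=440.0):
    """Generate 16-bit PCM samples of a sine wave (stand-in for a voice)."""
    samples = (int(32767 * math.sin(2 * math.pi * freq * t / SAMPLE_RATE))
               for t in range(n_samples))
    return struct.pack(f"<{n_samples}h", *samples)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
audio = tone(SAMPLE_RATE)                            # one second of "speech"
for i in range(0, len(audio), PACKET_SAMPLES * 2):   # 2 bytes per sample
    sock.sendto(audio[i:i + PACKET_SAMPLES * 2], DEST)
```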
But this is just the beginning of digital convergence. For years, most of us have had three different sets of wires and cables entering our homes and offices: one for electricity, one for conversation or computer data, and one for news and entertainment. Recently, some electric utilities have added a fourth set of lines that allows them to check our meter readings remotely. When all of these signals are digitized, it becomes possible to carry TV pictures on the telephone wires, computer data on the TV cable, or both of them on the electric utility's meter-checking lines. That's convergence.
"Digital Everything" is an even simpler notion. Computers are becoming so small, powerful, and cheap that soon almost any object more complex than pottery will be equipped with its own brain. Lights will adjust themselves to illuminate your book or keep glare off the CV (computer/television) screen. Intruder alarms will know enough not to call the police just because you left your keys on the dresser. Toasters will, learn whether you like your English muffins .lightly browned or charred beyond recognition.
Let us take a closer look at that smart toaster and its smarter companion appliances. Imagine a stove that arrives preprogrammed with all the recipes from your favorite restaurant or from a classic cookbook like The Joy of Cooking. It will also have a ROM card reader to let you add your own recipes. Just tell the stove what you want for dinner (like most computers, it will understand spoken instructions), and it will display a list of ingredients on its flat-panel screen.
The smart stove will announce when the skillet is hot enough to start the stir-fry vegetables, prompt you when the pasta is al dente, and give fair warning when the next step is required. It will sense when the soup is beginning to boil and automatically reduce the heat to a slow simmer. It will schedule all of your meal's courses to be ready, perfectly, at the time you want to serve them. Over time, it will also remember how all the members of the household like their food cooked: rare, medium, or well done. The smart stove will no doubt have many other "intelligent" functions that have not yet occurred to us.
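The boil-to-simmer trick, at least, requires nothing exotic: it is a simple feedback loop between a temperature sensor and the burner. The sketch below is our illustration of the idea; the sensor and burner hooks are hypothetical, with a simulated pot standing in for real hardware.

```python
import time

BOIL_POINT_C = 100.0   # water boils at 100 degrees C at sea level
SIMMER_POWER = 0.25    # fraction of full burner power for a slow simmer

def control_loop(read_temperature, set_burner_power, poll_seconds=0.1):
    """Feedback loop: run the burner flat out, then cut to a simmer at the boil."""
    set_burner_power(1.0)                       # full power to reach a boil
    while read_temperature() < BOIL_POINT_C:
        time.sleep(poll_seconds)                # wait, then re-read the sensor
    set_burner_power(SIMMER_POWER)              # boiling detected: back off
    print("Soup is boiling; reducing heat to a slow simmer.")

# Simulated pot standing in for the hypothetical sensor/burner hardware.
pot = {"temp_c": 20.0, "power": 0.0}

def read_temperature():
    pot["temp_c"] += 8.0 * pot["power"]         # crude heating model
    return pot["temp_c"]

def set_burner_power(fraction):
    pot["power"] = fraction

control_loop(read_temperature, set_burner_power)
```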
Fifteen years from now, product designers will still be figuring out startling ways to use the new intelligence of everyday appliances. No one of these innovations will change our lives by itself, but as the artifacts around us gradually learn to accommodate our individual needs, the world will become a friendlier, more convenient place in which to live.
In 15 years, computer chips will be about 15,000 times more potent than the processors that power today's cutting-edge personal computers. Tomorrow's typical desktop computer will finish in an hour a task that today would keep our most powerful desktop computers running 24 hours a day for two years.
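That comparison checks out arithmetically (a quick verification of ours, not from the essay): two years of round-the-clock computing is 17,520 hours, so collapsing the job into a single hour implies a speedup of the same order as the 15,000-fold figure.

```python
# Verify the essay's comparison: one hour versus two years of nonstop running.
hours_in_two_years = 2 * 365 * 24
print(hours_in_two_years)   # 17,520 -- the same order as a 15,000x speedup
```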
How? For one thing, engineers will be able to pack more circuit elements more efficiently onto tiny chips. The shorter the distances between those elements, the faster data can move among them.
A computer program consists of an enormous list of minute steps; early computer chips processed instructions one at a time until the program was complete. It was a lot like trying to move the entire population of New York City through a single subway turnstile.
In recent years, computer engineers have worked out ways around that bottleneck. "Pipeline" processors move several instructions through the system at once, opening more turnstiles. "Superscalar" processors can perform several instructions at once, in effect stuffing several people through each turnstile at the same time. Both processing techniques multiply the machine's speed. Current processors carry out three to six instructions at one time. In 15 years, the number is likely to be several dozen. These incremental advances alone will make tomorrow's computers several hundred times more powerful.
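The turnstile arithmetic is easy to work out. In the sketch below (our illustration, with idealized cycle counts), a non-pipelined processor spends every stage of every instruction in sequence, a pipelined one overlaps the stages, and a superscalar one pushes several instructions through each stage per cycle.

```python
import math

def cycles(n_instructions, stages=5, width=1, pipelined=True):
    """Idealized cycle counts for the turnstile analogy.

    stages: steps each instruction passes through (fetch, decode, ...).
    width:  instructions issued per cycle (1 = scalar, >1 = superscalar).
    """
    if not pipelined:
        return n_instructions * stages           # one instruction at a time
    # Fill the pipeline once, then retire `width` instructions per cycle.
    return stages + math.ceil(n_instructions / width) - 1

n = 1_000_000
base = cycles(n, pipelined=False)
for width in (1, 4, 32):
    c = cycles(n, width=width)
    print(f"width {width:2}: {c:,} cycles, {base / c:.1f}x faster")
```

With five stages, pipelining alone yields roughly a fivefold speedup, and widening the pipeline to a few dozen instructions per cycle multiplies that by the issue width, which is how the essay's "several hundred times more powerful" figure arises.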
We are less optimistic about parallel processing, another strategy from which researchers have long expected much greater advances in computing speed. This technique aims to break a problem into many smaller tasks, perform each one simultaneously on its own processor, and then recombine the results of the individual computations into a single answer. In theory, this should be the ultimate upgrade.
However, distributing each program among many processors has proved to be almost as hard as passing one of our New Yorkers through several turnstiles at once. It is difficult enough to separate most computing problems into easy-to-process fragments, and that is only the first hurdle that programmers face. Distributing those many parts to individual processors, keeping the sub-computers in step with each other, and then reassembling their answers into one grand result has proved all but impossible, save for those few specialized chores that lend themselves to subdivision. These problems will not be solved until someone achieves the kind of conceptual breakthrough whose appearance no one can predict. We suspect that programmers will still be struggling with them 15 years from now.
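Summing a long list of numbers is one of those few chores that does subdivide cleanly, which makes it a good illustration of the split-compute-recombine pattern the essay describes. The sketch below (ours, not the essay's) splits the work across four worker processes; most real programs resist this kind of carving up, which is exactly the essay's point.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum one sub-range; each worker process handles its own piece."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    pieces = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:                   # split ...
        partials = pool.map(partial_sum, pieces)  # ... compute in parallel ...
    print(sum(partials) == sum(range(n)))         # ... recombine: prints True
```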
