(Image retrieved from: http://www.globalgrind.com)
Wikipedia defines mobile computing as “human-computer interaction by which a computer is expected to be transported during normal usage”. While the PC started taking computing machines to the everyday consumer, mobile computing has truly infiltrated the average consumer market and revolutionized computing as a whole. Mobile computing is spreading faster than any other consumer technology in history. It has opened up new possibilities for human interaction by letting the computing experience continue regardless of geographical location. Smartphones have revolutionized not just computing but society as a whole, thanks to their ability to connect people in a very convenient and personal way.
Mobile computing is at present mainly delivered via smartphones and tablets; however, it applies equally to PDAs, in-vehicle computers, electronic readers, wearable computers and so on. The mobile computing market is currently divided into three main platforms: Apple iOS, Google Android and Microsoft Windows.
Innovation in mobile computing can be categorised into hardware, software and process. Hardware innovations can be seen in areas such as device power management, touch screens and material design for portability. An everyday smart watch today is said to pack more computing power than the computers that guided Apollo 11 to the moon. This feat is mainly due to innovations in materials, electronics and design, driven by the sheer competitive nature of the market. Devices on the brink of mass-scale usage, such as Google Glass and the Agent smart watch, will continue to expand the boundaries of mobile hardware technology.
(Google Glass image retrieved from: http://www.digitaltrends.com/wp-content/uploads/2013/05/google-glass-macro.jpg)
Although the initial focus of mobile computing was mainly on its hardware, an increasing amount of attention is given to software. Whole new software areas like mobile apps, location-aware programming and inter-device communication are developing every day thanks to the popularity of mobile computing. The open nature of the platforms, which lets anyone with the right skills build and distribute software for mobile devices, has fed the exponential adoption of mobile computing among the general public.
Another aspect of mobile computing is its contribution to engineering and management practices such as industrial design, supply chain management and mass-scale production and distribution. Global companies like Apple and Samsung depend heavily on a highly optimized and synchronized set of engineering and management processes to churn out millions of quality devices each day.
Mobile computing poses its own engineering and social challenges, such as security, privacy, inconsistent connectivity, potential health hazards and quality of interaction. Regardless of these challenges, mobile computing is expected to play a major role in the future of computing.
Monthly Archives: July 2014
Internet and the age of the PC
Personal computers have been around for almost 40 years and have changed the way people live in many different ways. Today we live in a society where a PC is a common, affordable and very useful instrument that we use to carry out day-to-day tasks. Before 1950 the computer was considered an instrument that could only be used by highly educated people like scientists and engineers. Later, computers came to be used for commercial purposes too. By 1955 there were only 250 computers throughout the world, while by the mid-1980s a million personal computers had been sold.
The following graph illustrates the growth of personal computer sales from 1980 to 1984. It clearly shows how the popularity of the PC increased.
(Image retrieved from http://www.contactandcoil.com/tag/standards/)
There were a handful of reasons for this:
· The size of the computer was reduced thanks to the use of transistors and ICs (semiconductors).
· The performance and capabilities of the PC increased dramatically.
· As PCs became more capable, the reliability of the machines and of the data they produced increased too.
· Alongside all these improvements, one of the main reasons for the popularity of computers among the general public was the drop in price. With new semiconductor technology, manufacturers were able to make more powerful computers for a much lower price.
As computers became an essential part of people's lives, manufacturers started to invest more and more in research into better technologies. As a result, inventions like the microprocessor emerged, and these inventions created new pathways to other technologies like mobile phones, PDAs and embedded systems. In 2008 the world's first processor consisting of 1 billion transistors (an Intel Itanium) was introduced; this is considered a milestone of microprocessor manufacturing because it hinted at an upper boundary for developing microprocessors using conventional technologies. Please read the post in the following link to know more about the latest trends in microprocessors.
Internet and World Wide Web
The predecessor of the Internet was a network called ARPANET. It was the first of its kind to use packet-switching technology to transfer data. It was designed in 1969 and used two computers, at Stanford and UCLA, as its hosts. From this initial step, others interested in networking computers began to build facilities like email, eBooks and web forums, but most of these activities were limited to scientists and engineers and weren't used by the general public until the World Wide Web emerged. In 1983 the TCP/IP protocol suite was introduced to ARPANET, and this expanded the number of devices that could be connected to the network. In 1984 DNS (the Domain Name System) was created; it is the system that maps a human-friendly name, such as a website address, to the numerical IP address of the machine hosting it. The importance of the IP address is that it is like a name given to a device connected to the network, helping the device identify itself and communicate with other devices. In 1990, Tim Berners-Lee, who had proposed the World Wide Web, created it. He introduced the standards still used on the Internet today, like HTTP (Hypertext Transfer Protocol), HTML (Hypertext Markup Language) and the URL (Uniform Resource Locator). The first webpage was created in 1991, and it consisted of an introduction to the World Wide Web.
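As a small illustration of what DNS-style name resolution does, the sketch below asks the operating system's resolver to turn a name into a numerical IP address using Python's standard socket module. This is only a sketch: the "example.com" lookup needs a network connection, and the address it returns can change over time.

```python
import socket

# "localhost" is a human-friendly name that name resolution
# maps to the numerical IPv4 loopback address.
print(socket.gethostbyname("localhost"))  # 127.0.0.1

# A real DNS lookup over the network (result may vary):
# print(socket.gethostbyname("example.com"))
```

The numerical address is what devices actually use to find and talk to each other; the name exists purely for humans.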
Digital era of computers (the saga of computers, part 2)
If you haven’t read my previous post about the inception of the computer, please read it from here.
(Replica of the first electro-mechanical computer. Image retrieved from www.wikipedia.org)
The world’s first electro-mechanical computer was the Z1, designed by Konrad Zuse. Its electromechanical successors were built using relays, which act as switches in electrical circuits. The Z1 was the first computer to use Boolean logic for problem solving, and it was also the first to use floating-point numbers in calculations. Although the Z1 used many mechanical components, its design had all the parts of a modern digital computer, such as a control unit, a memory unit and input/output devices, even though its outputs weren’t highly reliable. This series of computers (Z1, Z2 and Z3) helped develop many of the theories about computing used today. The reason for the unreliable output was the huge number of mechanical components: these parts mostly consisted of metal plates, and the stresses that occurred inside them during operation meant the machine couldn’t carry out calculations precisely.
During World Wars 1 and 2, technology advanced like never before, because scientists were encouraged to invent technologies that would help their nations win the war and rise against others. Many of the technologies we are familiar with today were developed during that period, including computer technology. Alan Turing, the famous mathematician who is considered the father of modern computing, laid the mathematical foundations for digital computers. He described a theoretical machine called the “Turing machine”, which is the grandfather of modern computers, mobile phones, embedded systems and almost any digital device you can think of.
During this period scientists were able to design and construct special-purpose computers. They were huge machines that consumed lots of power and human effort to operate, but these simple machines were the grandfathers of modern computers. They were called special-purpose because they were built to carry out only one particular task. The Colossus computer is an example of this type. It was built by the British military to break the secret German “Lorenz” cipher used in high-level communications (a cipher often confused with “Enigma”, which was attacked with different machines). The cipher was considered unbreakable, but with the help of this machine the British were able to read secret German messages, and it helped to end the Second World War more quickly, saving many valuable lives.
Picture of the Colossus computer.
The architecture of these computers was different from that of computers nowadays. There was no internal memory, and programming the computer meant configuring its hardware according to the programme being executed. This was a very tedious and time-consuming task. A single computer needed a team of operators, and only highly trained professionals like engineers or scientists were able to operate it.
At this time computers were expensive machines because of the high price of their hardware: transistors were not yet available, and everything was done using vacuum tubes. As new technologies emerged, scientists realised the potential of the computer and built more general-purpose machines that could be programmed with software to carry out a variety of programs. A computer called the SSEM, built in 1948, was the world’s first stored-program computer and a milestone in the history of computer science. ENIAC was the first general-purpose electronic computer, and UNIVAC I became the first commercial computer, one that could be programmed according to the requirements of its client. (You can view more details about UNIVAC from here.)
At this time people were more focused on the hardware of the computer, since machines weren’t cheap and only a limited number of manufacturers produced them. With the progress of system design, people began to pay attention to software as well. In the beginning, programming was done in machine language, which was a very difficult task. Then people began to use word phrases (mnemonics) to represent pieces of machine code, and assembly language was introduced. It made programming much easier, but it was still platform dependent, so programmers had to program different types of computers in different assembly languages. These were called low-level programming languages because they were just a more human-readable version of machine language. To solve this, scientists invented more advanced and abstract programming languages like Fortran, COBOL and C. These were platform independent and gave the programmer the freedom to write programs for a computer without knowing its architecture; they are called high-level languages. High-level languages made programming a computer much easier, but assembly is still very useful, since it gives the programmer more control over the machine, and most high-level languages cannot directly access a computer’s hardware. High-level languages use a special program called a compiler to translate the program a programmer wrote into machine code that the computer understands.
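The gap between human-readable source code and low-level mnemonics can be glimpsed with Python’s standard dis module, which prints the bytecode the Python interpreter runs. Bytecode is an interpreter’s instruction set rather than real machine code, so treat this only as an analogy for how a compiler or assembler lowers one readable line into several simpler instructions:

```python
import dis

def add(a, b):
    # One line of high-level source...
    return a + b

# ...becomes several low-level instructions with assembly-like
# mnemonics, such as LOAD_FAST and RETURN_VALUE.
dis.dis(add)
```

The exact mnemonics printed vary between Python versions, which is itself a small reminder of why low-level code is platform dependent.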
In the next post I’m hoping to write about the internet era of computers. Reading about the evolution of computers is fun and enjoyable. It may sound like rubbish, but there is a lot to learn from the past, and we can imagine how future generations will think about the computers we use today; I think it won’t be very different from the picture we have in our minds of the technology of 50 or 60 years ago.