The Commercialization of Virtual Reality Computers and Technology

The rate of progress in Virtual Reality is indeed stunning, and so is the commercialization of its products, services and potential applications. For instance, researchers, teachers, politicians and innovators are constantly stacked with projects that need a Virtual Reality technician’s skill set to help an audience, customer or funding group better visualize the work.

Of course, as things move forward ever faster, standardization is also coming to a head. Many VR consultants and leading-edge thinkers speak of granularity, verbs and software programming methodologies, but none are completely certain that this is how things will work in the future of VR; still, we can all see the push. The number of competing VR standards, theories and directions the marketplace is working with now is amazing. One article I read on the subject is quite insightful and, although four years old now, is pretty much on the money.

One of the reasons I make this statement is that the article discussed slicing food in a virtual-reality kitchen, with the sound of the knife rendered in EOX (essentially surround sound) and granularity (individually animated pixels) dividing the object. All of that works well for action sequences, explosions and car crashes in video games, for VR Life II-type worlds, and for training simulators. But combining multiple scenarios raises issues with AI, program size, current bandwidth, storage devices, and so on.

We can talk all about the future as VR technicians see it, and we can discuss all the applications for government, military, business, healthcare, earth sciences, space, training, psychology, sports, sex, politics, distance learning, sales or V-travel. Yet in the end a standard is needed so that the theories, methods and philosophies can all be on the same page, move the ball down the field, and attract the capital needed to blur the VR world into the real one, and to make a profit in doing so. ROI is what the real world is about.

We must never forget that as we create the future utopia in Virtual Reality. I certainly hope this article is of interest and that it has propelled thought. The goal is simple: to help you in your quest to be the best in 2007. I thank you for reading my many articles on the diverse subjects that interest you.

History of the Computer – Computers and Technology

The volume and use of computers in the world are so great that they have become difficult to ignore. Computers appear to us in so many forms that we often fail to see them for what they actually are. People interact with a computer when they purchase their morning coffee at the vending machine. As they drive to work, the traffic lights that so often hamper them are controlled by computers in an attempt to speed the journey. Accept it or not, the computer has invaded our lives.

The origins and roots of the computer are much like those of many other inventions and technologies: it evolved from a relatively simple idea, a plan designed to help perform functions more easily and quickly. The first basic computers were designed to do just that: compute! They performed basic mathematical functions such as multiplication and division and displayed the results in a variety of ways. Some computers displayed results as a binary representation in electronic lamps. Binary uses only ones and zeros; lit lamps represented ones and unlit lamps represented zeros. The irony is that users then needed to perform another mathematical step, translating binary to decimal, to make the result readable.
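That translation step is easy to illustrate. The sketch below is hypothetical (no particular historical machine encoded its lamps exactly this way); it simply treats each lit lamp as a 1 and each unlit lamp as a 0, with the leftmost lamp as the most significant digit:

```python
# Model a row of indicator lamps as a list of booleans:
# True = lit (binary 1), False = unlit (binary 0).

def lamps_to_decimal(lamps):
    """Translate a row of binary lamps into the decimal value they show."""
    value = 0
    for lit in lamps:
        # Shift the value left one binary place, then add the new digit.
        value = value * 2 + (1 if lit else 0)
    return value

# A panel showing lit, unlit, lit, lit reads as binary 1011:
print(lamps_to_decimal([True, False, True, True]))  # 11
```

This is exactly the mental arithmetic an early operator had to do by hand, which is why readable decimal output was such a practical improvement.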

One of the first computers was called ENIAC. It was huge, nearly the size of a standard railroad car, and contained electronic tubes, heavy-gauge wiring, angle iron, and knife switches, to name just a few of its components. It is difficult to believe that computers evolved from that into the suitcase-sized microcomputers of the 1990’s.

Computers eventually evolved into less archaic-looking devices near the end of the 1960’s. Their size had been reduced to that of a small automobile, and they processed segments of information at faster rates than older models. Most computers at this time were termed “mainframes”, because many computers were linked together to perform a given function. The primary users of these computers were military agencies and large corporations such as Bell, AT&T, General Electric, and Boeing, organizations that had the funds to afford such technologies. However, operating these computers required extensive intelligence and manpower resources; the average person could not have fathomed trying to operate and use these million-dollar processors.

The United States is credited with pioneering the computer. It was not until the early 1970’s that nations such as Japan and the United Kingdom began applying technology of their own to computer development, which resulted in newer components and smaller computers. The use and operation of computers developed into a form that people of average intelligence could handle and manipulate without too much ado. As the economies of other nations began to compete with the United States, the computer industry expanded at a great rate, prices dropped dramatically, and computers became more affordable for the average household.

Like the invention of the wheel, the computer is here to stay. The operation and use of computers in our present era of the 1990’s has become so easy and simple that perhaps we take too much for granted. Almost everything of use in society requires some form of training or education. Many people say that the predecessor of the computer was the typewriter, which certainly required training and experience to operate at a usable and efficient level. Children are now taught basic computer skills in the classroom to prepare them for the future evolution of the computer age.

The history of computers started out about 2000 years ago, at the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved around, according to programming rules memorized by the user, all regular arithmetic problems can be done. Another important invention around the same time was the Astrolabe, used for navigation.

Blaise Pascal is usually credited with building the first digital calculating machine, in 1642. It added numbers entered with dials and was made to help his father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a machine that was built in 1694; it could add and, after some rearrangement, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, and this mechanism is still used.

The prototypes made by Pascal and Leibniz were not used in many places and were considered odd until, a little more than a century later, Thomas of Colmar (also known as Charles Xavier Thomas) created the first successful mechanical calculator that could add, subtract, multiply, and divide. Many inventors followed with improved desktop calculators, so that by about 1890 the range of improvements included accumulation of partial results, storage and automatic reentry of past results (a memory function), and printing of results. Each of these still required manual installation. These improvements were made mainly for commercial users, not for the needs of science.

While Thomas of Colmar was developing the desktop calculator, a series of very interesting developments in computing was begun in Cambridge, England, by Charles Babbage (for whom the computer store “Babbage’s” was named), a mathematics professor. In 1812, Babbage realized that many long calculations, especially those needed to produce mathematical tables, were really a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to perform them automatically. He began to design an automatic mechanical calculating machine, which he called a difference engine, and by 1822 he had a working model to demonstrate. With financial help from the British government, Babbage started fabrication of a full difference engine in 1823. It was intended to be steam-powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program.

The difference engine, although of limited adaptability and applicability, was a great advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he had what he thought was a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this idea an Analytical Engine. The design showed great foresight, although that could not be fully appreciated until a full century later.

The plans for this engine called for a decimal machine operating on numbers of 50 decimal digits (or words) and having a storage capacity (memory) of 1,000 such numbers. The built-in operations were to include everything a modern general-purpose computer would need, even the all-important conditional control transfer capability that would allow commands to be executed in any order, not just the order in which they were programmed.
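Conditional control transfer is the idea that a computed result can decide which instruction runs next. It can be sketched with a toy instruction interpreter; the instruction set below is invented purely for illustration and is not Babbage’s actual design:

```python
# A toy interpreter with a single accumulator and a conditional jump.
# Hypothetical instructions: ("set", n), ("sub", n),
# ("jump_if_positive", target), ("halt",).

def run(program):
    acc = 0   # accumulator register
    pc = 0    # program counter: index of the next instruction
    while True:
        op = program[pc]
        if op[0] == "set":
            acc = op[1]
            pc += 1
        elif op[0] == "sub":
            acc -= op[1]
            pc += 1
        elif op[0] == "jump_if_positive":
            # Conditional control transfer: the data decides what runs next.
            pc = op[1] if acc > 0 else pc + 1
        elif op[0] == "halt":
            return acc

# Repeatedly subtract 3 from 10 until the result is no longer positive:
program = [
    ("set", 10),               # 0
    ("sub", 3),                # 1
    ("jump_if_positive", 1),   # 2: loop back to 1 while acc > 0
    ("halt",),                 # 3
]
print(run(program))  # acc goes 10 -> 7 -> 4 -> 1 -> -2, prints -2
```

Without the conditional jump, the machine could only execute its instructions in fixed order; with it, loops and decisions become possible, which is why the capability was so important.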

As one can see, it took a large amount of intelligence and fortitude to arrive at the 1990’s style and use of computers. People assume that computers are a natural development in society and take them for granted. Just as people have had to learn to drive an automobile, it takes skill and learning to use a computer.

Computers in society have become difficult to pin down. Exactly what a computer consists of and what actions it performs depend heavily on its type; saying that a person has a typical computer doesn’t necessarily narrow down what that computer’s capabilities are. Computer styles and types cover so many different functions and actions that it is difficult to name them all. The original computers of the 1940’s were easy to define when they were first invented: they primarily performed mathematical functions many times faster than any person could calculate. Since then, however, the evolution of the computer has created many styles and types, each greatly dependent on a well-defined purpose.

The computers of the 1990’s fell roughly into three groups: mainframes, networking units, and personal computers. Mainframe computers were extremely large machines with the capacity to process and store massive amounts of data in the form of numbers and words; they were the first type of computer, developed in the 1940’s. Their users ranged from banking firms to large corporations and government agencies. They were usually very expensive but designed to last at least five to ten years, and they required well-educated, experienced manpower to operate and maintain. Larry Wulforst, in his book Breakthrough to the Computer Age, describes the old mainframes of the 1940’s compared to those of the 1990’s by likening the difference to “the contrast to the sound of the sputtering motor powering the first flights of the Wright Brothers at Kitty Hawk and the roar of the mighty engines on a Cape Canaveral launching pad”. End of part one.

Computers and Technology in the Academic Learning of Young Children

In today’s world, computers have become a familiar fixture in the daily lives of children and adolescents, offering a wide range of learning and entertainment tools.

While surveys have indicated that boys are heavier users of computer games and visit websites more often than girls, no gender differences have emerged for chatting, using e-mail, or doing schoolwork on the computer. Additionally, both teenage boys and girls have expressed equal confidence in their computer skills.

On average, parents estimate that their school-age children and adolescents use the computer approximately 1.5 hours per day. Computers, electronic games and toys, and technology in general largely influence and affect the lives of children. Technology has thus proven to be largely capable of enriching the lives of children, especially in the areas of academic and social learning and development.

Computers, for instance, can have rich cognitive and social benefits. Children as young as 3 years of age enjoy computer activities and are capable of typing simple commands on a standard keyboard. Additionally, in today’s classrooms, small groups often gather around the machine, and children collaborate more often while working at the computer than when using traditional paper-and-pencil methods.

As soon as children start to become literate (being able to read and write), they can make use of the computer for word processing. This lets them write without struggling with handwriting, and they can revise text meanings and style, and check their spelling. As a result, children tend to worry less about making mistakes, and their written products end up longer and of higher quality.

Specially designed computer languages introduce children to programming skills. With adult support, children’s efforts at computer programming can lead to improved concept formation, problem solving and creativity. Furthermore, because children must detect errors in their programs to make them work, programming helps them reflect on their thought processes, leading to gains in metacognitive knowledge and self-regulation. While programming, children are also particularly likely to collaborate, persist in the face of challenge, and demonstrate positive attitudes toward learning. This is consistent with Vygotsky’s theory: social interaction that supports children’s mastery of challenging computer tasks can foster a wide range of higher cognitive processes.
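Languages in the Logo tradition, where the child steers a “turtle” with simple movement commands, are the classic example of such child-oriented programming. A minimal text-based sketch of the idea (this tiny Turtle class is my own illustration, not any real Logo implementation):

```python
import math

# A minimal text-based "turtle": the child steers a cursor with
# forward/turn commands, and the program tracks where it ends up.

class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 means facing east

    def forward(self, distance):
        # Move in the direction the turtle is currently facing.
        self.x += distance * math.cos(math.radians(self.heading))
        self.y += distance * math.sin(math.radians(self.heading))

    def turn(self, degrees):
        self.heading = (self.heading + degrees) % 360

# Drawing a square: repeat "forward, turn 90" four times.
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.turn(90)

# After four sides the turtle is back where it started:
print(round(t.x), round(t.y))  # 0 0
```

The debugging described above happens naturally here: a child who writes `turn(80)` instead of `turn(90)` sees the turtle fail to return home and must reason about why, which is exactly the kind of reflection on one’s own thinking the research points to.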

Children and adolescents spend much of their time using home computers purely for entertainment. Many computer games emphasize speed and action in sometimes violent plots in which children advance by shooting at and evading enemies. Children also play more complex exploratory and adventure games with themes of conquest and aggression, as well as sports games such as football and soccer. They likewise enjoy simulation games, for example creating and caring for virtual pets (which require attention to “stay alive”), entering virtual realities (such as an ecosystem where the player mutates plants and animals into new species), and role-playing characters.

Speed-and-action computer games cultivate attentional and spatial skills in both boys and girls. However, while offering opportunities for learning, extensive playing of simulation games might risk blurring the distinction between virtual and real life.

Many youths use the computer to communicate. While internet use carries some potential for disengagement from real life, it holds much value in letting users acquire computer skills and information and in enabling communication.