In 1833, Charles Babbage, a professor of mathematics at Cambridge University in Great Britain, proposed a device called the "Analytical Engine", which he tried for the rest of his life to build but never could, owing to technical problems that were insoluble at the time. Babbage's machine was to contain an input device for punched cards, a store for 1,000 numbers of 50 digits each, an arithmetical unit (the "mill") capable of one addition or subtraction per second and one multiplication per minute, automatic printing, and sequential program control. It would have worked to 20-digit precision. This was the prototype of the computer, 100 years ahead of its time. In his work, Babbage was assisted by Ada Augusta Byron, who is often called the first programmer and who, among other contributions, corrected errors in his work.

Starting in 1937, the American professor Howard Aiken began work on an automated computing machine that would combine the technology of the day with Hollerith punched cards. The project, completed in 1944, was called the "Mark I". It was an electromechanical machine: its computations were controlled by electromagnetic relays and mechanical counters.

The first electronic computer was built by John Vincent Atanasoff, a professor of mathematics and physics at Iowa State College, together with Clifford Berry, beginning in 1937-1938. This device used vacuum tubes for storage and for its arithmetical and logical functions. In the early 1940s, building on this achievement, ENIAC was constructed: with 18,000 vacuum tubes, it could perform 300 multiplications per second, 300 times faster than any other device of the age. Its instructions were not stored internally but were fed in through external cards and switches.

In the mid-1940s, John von Neumann, a mathematical genius, wrote a paper suggesting that all computing machines should use the binary number system and that data and instructions should be stored internally in the machine. The first machine to meet these requirements was the EDSAC (1949), built at Cambridge University.
Computers serve as automatic data processors. The operations they can perform on data are limited (at least in theory) by the need for a complete and unambiguous specification of the process. In practice, these requirements are difficult, and sometimes impossible, to satisfy. In other cases the limiting factor is the capability of the equipment, although advances continue to push these limits further.