
Friday, March 6, 2009

INTEL: A BRIEF VIEW

Intel has grown through several distinct phases. At its founding, Intel was distinguished simply by its ability to make semiconductors, and its primary products were static random access memory (SRAM) chips. Intel's business grew during the 1970s as it expanded and improved its manufacturing processes and produced a wider range of products, still dominated by various memory devices.

While Intel created the first microprocessor (Intel 4004) in 1971 and one of the first microcomputers in 1972, by the early 1980s its business was dominated by dynamic random access memory chips.
However, increased competition from Japanese semiconductor manufacturers had, by 1983, dramatically reduced the profitability of this market, and the sudden success of the IBM personal computer convinced then-CEO Grove to shift the company's focus to microprocessors, and to change fundamental aspects of that business model.
By the end of the 1980s this decision had proven successful, and Intel embarked on a 10-year period of unprecedented growth as the primary (and most profitable) hardware supplier to the PC industry.

After 2000, growth in demand for high-end microprocessors slowed and competitors garnered significant market share, initially in low-end and mid-range processors but ultimately across the product range, and Intel's dominant position was reduced.
In the early 2000s then-CEO Craig Barrett attempted to diversify the company's business beyond semiconductors, but few of these activities were ultimately successful.

In 2005, CEO Paul Otellini reorganized the company to refocus its core processor and chipset business on platforms (enterprise, digital home, digital health, and mobility) which led to the hiring of over 20,000 new employees.
In September 2006, due to falling profits, the company announced a restructuring that resulted in layoffs of 10,500 employees, or about 10 percent of its workforce, by July 2006. Its research lab located at Cambridge University was closed at the end of 2006.

INTEL: COMPUTER TYCOON

Intel Corporation (NASDAQ: INTC; SEHK: 4335) is the world's largest semiconductor company and the inventor of the x86 series of microprocessors, the processors found in most personal computers.
Intel was founded on July 18, 1968, as Integrated Electronics Corporation and is based in Santa Clara, California, USA. Intel also makes motherboard chipsets, network cards and ICs, flash memory, graphics chips, embedded processors, and other devices related to communications and computing.
Founded by semiconductor pioneers Robert Noyce and Gordon Moore, and widely associated with the executive leadership and vision of Andrew Grove, Intel combines advanced chip design capability with a leading-edge manufacturing capability. Originally known primarily to engineers and technologists, Intel's successful "Intel Inside" advertising campaign of the 1990s made it and its Pentium processor household names.

Intel was an early developer of SRAM and DRAM memory chips, and this represented the majority of its business until the early 1980s. While Intel created the first commercial microprocessor chip in 1971, it was not until the success of the personal computer (PC) that this became their primary business.
During the 1990s, Intel invested heavily in new microprocessor designs fostering the rapid growth of the PC industry. During this period Intel became the dominant supplier of microprocessors for PCs, and was known for aggressive and sometimes controversial tactics in defense of its market position, as well as a struggle with Microsoft for control over the direction of the PC industry.
The 2007 rankings of the world's 100 most powerful brands published by Millward Brown Optimor showed the company's brand value falling 10 places, from number 15 to number 25.

Thursday, March 5, 2009

MINI COMPUTERS OR LAPTOPS

Earlier, computers used to be very bulky and occupied a lot of space, and they required heavy cooling. Then, after a long period of invention, personal computers came in; they looked like TVs. Then came laptops, which revolutionized the concept of computing and are now replacing desktop computers at workplaces and at home. Laptops take up very little space and offer even more features than a desktop computer, so people are opting for them.


Laptops are in great demand. Today there are a lot of companies manufacturing and dealing in computers and laptops, and because of the tough competition prevailing in the market, companies are coming up with cheap laptops.

These devices come with the latest features and technologies. Some of the brands are Lenovo, Acer, Sony, Toshiba, Compaq and HP, and they offer very good, up-to-date products and services at very low prices. The main attractions of the latest laptops are that they are very handy, consume less space, and come with wide-screen LCD displays, fast processors, and more storage space as well as internal memory.

Users can also attach an external mouse or pen drive and use the Internet with a laptop. Mini laptops are also available; they are also called palmtops or handheld computers. They are very small and can easily fit into the user's palm, with a small screen and a squeezed keyboard. They are not as efficient as the latest full-size laptops, whereas laptops themselves are more efficient than a normal desktop computer.

Buying a laptop is not an easy task; it requires a lot of research and surveys. There are many companies providing good products and services, so it takes a lot of patience to find the best and cheapest laptop with the latest features. Users can also search online. Today online shopping is very common and very popular too, and users can get the best deals and prices online. There are lots of online sites from which buyers can buy laptops at amazing prices.

E-MARKETING


It is challenging to start a home business and to learn the Web marketing needed to make it succeed. Many people dream of starting one because it offers financial and personal independence, but it is also a frightening, challenging and puzzling path to take. If you don't know how to start, here are some hints to get you moving in a methodical, cheap and informed way.

When developing an online business and a Web marketing campaign, you need to learn what works in your company's industry, and what elements and technical requirements you need to start it and keep it running smoothly. There is a lot to study and investigate at the beginning, so get assistance to guide you through the steps in a methodical way. An outstanding resource for this is ittybiz.com, where you will find priceless tips on how to build a successful home business step by step.

You must advertise your business, and there is a simple way to get good at Web marketing. Online you can find many easy-to-learn marketing tool kits, so choose the best one for your business: just type "Web Marketing Tool Kit" into a search engine and check the main results. The book "Book Yourself Solid" by Michael Port is also an easy way to learn Web marketing, with practical exercises to determine who your clients should be, what you can offer them and how to find potential ones.

Research copywriting. You don't have to turn into a best-selling novelist, but you do have to learn how to write an effective landing page, an attractive piece that encourages your visitors to pay for what you are selling, because it is vital for your Web marketing efforts. Powerful copywriting calls for clarity, vigor, appeal, experience and quality, and you already have all that: you know your market deeply, what moves them, what they are asking for and why they should choose you. So just type "commercial copywriting tips" into a search engine and check the results you get.

Finally, write a business plan; every small company considering an effective Web marketing campaign should have one. Ughh... ugly stuff? Not at all: keep it simple and effective. Check out Jim Horan's book "The One Page Business Plan".

Computers and Information Technologies


Computers and information technologies are as ubiquitous in our lives today as the air we breathe. Computers have led to a third information revolution, taking its place alongside the agricultural and industrial revolutions. We see computers everywhere: in desktop PCs, PDAs, cars, washing machines, ATMs. The resulting multiplication of humankind's intellectual strength has affected our daily lives and has also changed the way in which we search for new knowledge and resources. Growing at a steady rate of around 11 percent, the computer industry has seen an unprecedented boom over the last decade, and nearly everyone is affected by the phenomenon. This unprecedented growth has produced amazing technological progress since the inception of electronic computing in the late 1940s. By way of comparison, had the transport industry kept pace with the computer industry, today we could travel from coast to coast in a few seconds for roughly a few pennies. Of course, growth almost always brings growing pains; in the case of the computer industry, the expectation that nearly everyone has a computer leads to the need for nearly everyone to own one.
The Internet as a Computer Sales Driver
The advent of the internet has certainly spurred on the success of the computer industry. The number of internet users worldwide grew from 274 million in 1999 to 605 million in 2002, a staggering growth of 119 percent. In the US alone, the number of internet users increased by 53 million (16 percent), with a penetration of 52 percent. The percentage of the world's total population with internet access increased from 7 percent in 1999 to 9 percent in 2002.
Explosive Growth In Computer System Sales
During the past five to six years, there has been explosive growth in the number of PC buyers, not only in developed regions such as the United States and Europe but also in the Asia Pacific and other regions of the world. In fact, the Chinese computer industry grew a whopping 38 percent in 2004, almost dwarfing the growth rate of the United States, which was 35.2 percent over the period 1990-2000. Worldwide sales of personal computers increased from 394 million in 1999 to 550 million in 2002. The total number of PCs in the US is 178 million, which accounts for 62 percent penetration of US households; put another way, the United States has about 625 PCs per 1,000 inhabitants, a high ratio that stands in sharp contrast to the world's population as a whole.

COMPUTER EDUCATION


Computers are without doubt the most radical invention of mankind so far, as they have revolutionized the way we live. They have touched almost every aspect of our lives, from performing simple calculations to giving unlimited access to a boundary-less world through the internet. The most radical areas of computer science include artificial intelligence, the study of a system's spontaneous reactions, and communication sciences. Computer science has touched our lives so radically that it has changed the way we shop, learn and interact.

Computer-based learning is where a computer serves as a tool to gain an education. Computer education programs offer tremendous flexibility to students: there are no time restrictions, so the program can fit into your schedule at your convenience, and individuals can improve their qualifications while continuing in their jobs. Such programs also save travelling effort and cost, since you have access to all the coaching materials right at your place, and they allow you to learn new concepts at your own pace and level of understanding. In case of disability or health issues you can still enjoy access to the learning without facing any difficulty, and with a computer education program you no longer have to miss anything while you are on an official tour to another location.

'Computer science' is an immensely diverse field based essentially on the study of the theoretical foundations of information and computation and their implementation and application in computer systems. Computer education has dynamic fields such as computer graphics, which deals with developing 2D and 3D images and, further, moving images; other fields deal with computational problems, while still others focus on the challenges of implementing computations. Programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems. A further subfield, human-computer interaction, focuses on the challenges of making computers and computations useful, usable and universally accessible to people.

Computers also have a great impact on other fields of study and provide great aid in areas like physics, linguistics and, most importantly, artificial intelligence. So when you are considering going into computers, your choices are numerous: mathematical foundations, artificial intelligence, algorithms and data structures, software engineering, networking, graphic design, system architecture and design, cryptography and many more. The career domain includes embedded systems, multimedia, telecommunications, computer networks, computer network security, business applications for commerce, retail, customer relationship management and ERP, as well as knowledge content management.

Still not sure whether computers are the right option for you? Computing is without doubt one of the most dynamic areas; its growth has been immense and is still not showing any signs of slowing down. The importance of this industry is evident from its involvement in every field of life: whether it is a lawyer's firm, a bank, a corporate company, a hospital or any other sort of business, IT is without question an integral part of performing the relevant tasks, with customized software and networking solutions. Although computing is one of the fastest-growing segments of industry, it is also one of the most continuously changing areas in terms of technology.

A computing professional's education is not limited to a college degree but continues with seminars, conferences, advanced courses and training as new research emerges, so people who want to make it big in this market should always be on their toes and should have a strong ability to predict future needs. In computer theory and applications, new ideas are developed every day, and success requires an ongoing commitment to learning in order to maintain knowledge, skills and career opportunities.

COMPUTER SCIENCE


Computer science (or computing science) is the study of the theoretical foundations of information and computation, and of practical techniques for their implementation and application in computer systems. It is frequently described as the systematic study of algorithmic processes that describe and transform information; the fundamental question underlying computer science is, "What can be (efficiently) automated?" Computer science has many sub-fields; some, such as computer graphics, emphasize the computation of specific results, while others, such as computational complexity theory, study the properties of computational problems. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems, and human-computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to people.
The general public sometimes confuses computer science with vocational areas that deal with computers (such as information technology), or thinks that it relates to their own experience of computers, which typically involves activities such as gaming, web-browsing, and word-processing. However, the focus of computer science is more on understanding the properties of the programs used to implement software such as games and web browsers, and on using that understanding to create new programs or improve existing ones.

ROBO PROGRAMME


In practical terms, a computer program may run from just a few instructions to many millions of instructions, as in a program for a word processor or a web browser. A typical modern computer can execute billions of instructions per second (gigahertz or GHz) and will rarely make a mistake over many years of operation. Large computer programs comprising several million instructions may take teams of programmers years to write, so it is highly unlikely that the entire program has been written without error.


Errors in computer programs are called "bugs". Bugs may be benign and not affect the usefulness of the program, or have only subtle effects. But in some cases they may cause the program to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, or to completely fail or "crash". Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an "exploit", code designed to take advantage of a bug and disrupt a program's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.

In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode, the command to multiply them would have a different opcode and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from—each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer just as if they were numeric data.
The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches.
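
To make the stored-program idea concrete, here is a minimal Python sketch (purely illustrative, with made-up opcodes, not any real machine's instruction set): a toy computer whose memory is a single list of numbers, in which the first cells hold instructions and the last cells hold data.

# Hypothetical toy stored-program machine: instructions are just numbers
# kept in the same memory as the data they operate on.
# Opcodes: 1 = ADD, 2 = MUL, 0 = HALT. Each instruction takes 4 cells:
# [opcode, address_a, address_b, address_result].

def run(memory):
    pc = 0  # program counter: index of the next instruction in memory
    while True:
        opcode, a, b, dest = memory[pc:pc + 4]
        if opcode == 0:      # HALT: stop and return the final memory state
            return memory
        elif opcode == 1:    # ADD: memory[dest] = memory[a] + memory[b]
            memory[dest] = memory[a] + memory[b]
        elif opcode == 2:    # MUL: memory[dest] = memory[a] * memory[b]
            memory[dest] = memory[a] * memory[b]
        else:
            raise ValueError("unknown opcode: %d" % opcode)
        pc += 4              # advance to the next instruction

# Program and data share one list of numbers:
# cells 0-11 hold instructions, cells 12-14 hold data.
memory = [
    1, 12, 13, 14,   # ADD  cell 12 + cell 13 -> cell 14
    2, 14, 14, 14,   # MUL  cell 14 * cell 14 -> cell 14
    0, 0, 0, 0,      # HALT
    3, 4, 0,         # data: 3, 4, and a result cell
]
print(run(memory)[14])   # prints 49, i.e. (3 + 4) squared

Because the program here is nothing more than numbers sitting in memory, it could be read or rewritten like any other data, which is exactly the point of the von Neumann design.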

ARCHITECTURE


Nearly all modern computers implement some form of the stored-program architecture, making it the single trait by which the word "computer" is now defined. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture.

Microprocessors are miniaturized devices that often implement stored program CPUs.
Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953.

In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased size and cost and further increased speed and reliability of computers. By the 1980s, computers became sufficiently small and cheap to replace simple mechanical controls in domestic appliances such as washing machines. The 1980s also witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household.
Modern smartphones are, in a technical sense, fully programmable computers in their own right, and as of 2009 they may well be the most common form of such computers in existence.


The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that a list of instructions (the program) can be given to the computer and it will store them and carry them out at some time in the future.
In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction.
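
As a rough illustration (a hypothetical sketch in Python, not any real instruction set), a jump can be modelled as an instruction that changes the program counter instead of letting it advance by one, and a conditional jump does so only when a previously computed value meets some condition:

# Hypothetical sketch of jump instructions: the program counter normally
# advances by one, but JUMP and JUMP_IF_ZERO can send it somewhere else.
# Each instruction is a tuple (name, operands...); registers live in a dict.

def execute(program, registers):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "SET":                 # SET reg, value
            registers[args[0]] = args[1]
            pc += 1
        elif op == "SUB":               # SUB reg, value  (reg = reg - value)
            registers[args[0]] -= args[1]
            pc += 1
        elif op == "JUMP":              # unconditional jump to an index
            pc = args[0]
        elif op == "JUMP_IF_ZERO":      # jump only if the register is zero
            pc = args[1] if registers[args[0]] == 0 else pc + 1
        else:
            raise ValueError("unknown instruction: %s" % op)
    return registers

# Count down from 3 to 0: JUMP_IF_ZERO decides whether to leave the loop.
program = [
    ("SET", "x", 3),            # 0: x = 3
    ("JUMP_IF_ZERO", "x", 4),   # 1: if x == 0, jump past the loop
    ("SUB", "x", 1),            # 2: x = x - 1
    ("JUMP", 1),                # 3: go back and test again
]                               # 4: falling off the end stops the machine
print(execute(program, {}))     # prints {'x': 0}

A subroutine call works the same way, except that the machine also remembers the address it jumped from, so that a matching "return" instruction can carry on at the instruction just after the call.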

Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.
Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time—with a near certainty of making a mistake.
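
By contrast, the same task takes a computer one short loop; the Python fragment below is only an illustration of the point.

# Add all the numbers from 1 to 1,000 by repeating one addition in a loop,
# rather than pressing thousands of calculator buttons by hand.
total = 0
for n in range(1, 1001):
    total += n
print(total)   # prints 500500

The computer repeats the same addition a thousand times, finishes in a tiny fraction of a second, and cannot make a slip of the finger along the way.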

Wednesday, March 4, 2009

ABOUT COMPUTER ROBOT



It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to varying interpretations over time. Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the aid of a mechanical calculating device.
The history of the modern computer begins with two separate technologies—that of automated calculation and that of programmability.
Examples of early mechanical calculating devices included the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC). Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when.[3] This is the essence of programmability.
The "castle clock", an astronomical clock invented by Al-Jazari in 1206, is considered to be the earliest programmable analog computer.[4] It displayed the zodiac, the solar and lunar orbits, a crescent moon-shaped pointer travelling across a gateway causing automatic doors to open every hour,[5][6] and five robotic musicians who play music when struck by levers operated by a camshaft attached to a water wheel. The length of day and night could be re-programmed every day in order to account for the changing lengths of day and night throughout the year.[4]
The end of the Middle Ages saw a re-invigoration of European mathematics and engineering, and Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers. However, none of those devices fit the modern definition of a computer because they could not be programmed.