
Posts

Showing posts from 2017

Computer Mouse

A computer mouse is a handheld hardware input device (also called a pointing device) that controls a cursor in a GUI and can move and select text, icons, files, and folders. For desktop computers, the mouse is placed on a flat surface such as a mouse pad or a desk, in front of the computer.

Differences between analog and digital computers.

Analog Computer: An analog computer is a form of computer that uses the continuously changeable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. Because an analog computer uses continuous rather than discrete values, processes cannot be reliably repeated with exact equivalence. Digital Computer: A digital computer is a form of computer that uses letters and numbers as inputs, processes them, displays the result on monitors or other output devices, and can also store the output in memory to be used later. The following are some key differences between digital and analog computers:
No. 1 - Analog Computer: This type of computer uses continuously changeable aspects of physical phenomena. Digital Computer: This type of computer runs by turning electricity on and off, i.e. digital signals.
No. 2 - Analog Computer: Electrical waves generated from pressure, temperature, etc. are used as input in analog computers…

Difference between Supercomputer & Mainframe Computer

Difference between Microcomputer, Minicomputer & Mainframe Computer

Assembler

An Assembler is a type of computer program that translates software programs written in assembly language into machine language: the code and instructions that can be executed by a computer.
An assembler enables software and application developers to access, operate and manage a computer's hardware architecture and components.
An assembler is sometimes referred to as the compiler of assembly language. It also provides the services of an interpreter.

Programming language processor that translates an assembly language program (the source program) to the machine language program (the object program) executable by a computer.

Read more: http://www.businessdictionary.com/definition/assembler.html
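
To make the idea concrete, here is a minimal sketch of what an assembler does, written in Python with a tiny made-up instruction set; the mnemonics, opcodes, and two-byte encoding are hypothetical and chosen only for illustration.

```python
# Toy assembler: translates mnemonic source lines (the source program)
# into numeric machine code (the object program).
# The instruction set below is invented purely for this illustration.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate assembly source text into a list of machine-code bytes."""
    machine_code = []
    for line in source.strip().splitlines():
        parts = line.split()                    # e.g. ["ADD", "7"]
        opcode = OPCODES[parts[0].upper()]      # look up the numeric opcode
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.extend([opcode, operand])  # fixed two-byte encoding
    return machine_code

program = """
LOAD 10
ADD 7
STORE 20
HALT
"""

print(assemble(program))   # [1, 10, 2, 7, 3, 20, 255, 0]
```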

COMPILER & INTERPRETER

A compiler is a program that translates the entire source program written in some high-level programming language (such as Java) into machine code for some computer architecture (such as the Intel Pentium architecture).
A compiler is a computer program that reads the entire source code and outputs assembly code or executable code. In other words, it is software used to translate the text that a programmer writes into a format the CPU can use, or a piece of software that takes third-generation language code and translates it into instructions that a computer can understand, such as a specific assembly code. Compilers can be quite complicated pieces of software. An Interpreter is a program that translates a source program written in some high-level programming language (such as BASIC) into machine code for some computer architecture (such as the Intel Pentium architecture) line by line.
A computer program which reads source code line by line and outputs as…
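
As a rough sketch of the difference, Python's built-in compile() function translates a whole source string before any of it runs (compiler-like behaviour), while executing each statement as soon as it is read mimics an interpreter. Note that compile() here produces Python bytecode, not native machine code, so this is only an illustration of the two strategies.

```python
# Illustration of the two translation strategies using Python built-ins.
# compile() turns the whole source into bytecode first (compiler-like);
# the loop executes one line at a time as it is read (interpreter-like).

source = "x = 2\ny = 3\nprint(x * y)"

# Compiler-style: translate the entire program first, then run the result.
code_object = compile(source, "<demo>", "exec")   # whole program translated
exec(code_object)                                 # prints 6

# Interpreter-style: translate and execute one line at a time.
for line in source.splitlines():
    exec(line)                                    # prints 6 again
```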

Fourth generation Language

A fourth generation (programming) language (4GL) is a grouping of programming languages that attempt to get closer than 3GLs to human language, forms of thinking, and conceptualization.

4GLs are designed to reduce the overall time, effort and cost of software development. The main domains and families of 4GLs are: database queries, report generators, data manipulation, analysis and reporting, screen painters and generators, GUI creators, mathematical optimization, web development and general purpose languages.

Also known as a 4th generation language, a domain-specific language, or a high-productivity language.
Advantages:
1. Simplified the programming process.
2. Use nonprocedural languages that encourage users and programmers to specify the results they want, while the computer determines the sequence of instructions that will accomplish those results (illustrated in the sketch at the end of this section).
3. Use natural languages that impose no rigid grammatical rules.

Disadvantages:
1. Less flexible than other languages.
2. …
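
SQL, a typical database-query 4GL, shows the nonprocedural idea in practice: you state the result you want and the database system works out the steps. Here is a small sketch using Python's built-in sqlite3 module; the table and column names are made up for the example.

```python
# 4GL-style declarative query (SQL) versus 3GL-style procedural code.
# The employees table and its contents are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Asha", 52000), ("Bikash", 38000), ("Chitra", 61000)])

# 4GL style: say WHAT you want; the database decides HOW to get it.
rows = conn.execute(
    "SELECT name FROM employees WHERE salary > 50000 ORDER BY name").fetchall()
print([name for (name,) in rows])        # ['Asha', 'Chitra']

# 3GL style: spell out the sequence of steps yourself.
high_paid = []
for name, salary in conn.execute("SELECT name, salary FROM employees"):
    if salary > 50000:
        high_paid.append(name)
print(sorted(high_paid))                 # ['Asha', 'Chitra']
```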

Third Generation Language

A third-generation programming language (3GL) is a generational way to categorize high-level computer programming languages. Where assembly languages, categorized as second generation programming languages, are machine-dependent, 3GLs are much more machine independent and more programmer-friendly. This includes features like improved support for aggregate data types, and expressing concepts in a way that favors the programmer, not the computer. A third generation language improves over a second generation language by having the computer take care of non-essential details. 3GLs feature more abstraction than previous generations of languages, and thus can be considered higher level languages than their first and second generation counterparts.
First introduced in the late 1950s, Fortran, ALGOL, and COBOL are early examples of these sorts of languages.
Most popular general-purpose languages today, such as C, C++, C#, Java, BASIC and Pascal, are also third-generation languages…

2nd Generation Language

2nd Generation Language or Assembly Language 

Second-generation programming language (2GL) is a generational way to categorize assembly languages. The term was coined to provide a distinction from higher level third-generation programming languages (3GL) such as COBOL and earlier machine code languages.
Second-generation programming languages have the following properties:
The code can be read and written by a programmer. To run on a computer it must be converted into a machine-readable form, a process called assembly.
The language is specific to a particular processor family and environment.
It is the first step toward improving the programming structure. You should know that a computer can handle numbers and letters; therefore, combinations of letters can be used to substitute for the numeric machine codes.
The set of symbols and letters forms the Assembly Language and a translator program is required to translate the Assembly Language to machine language. This translator program is ca…

First-generation programming language

A first-generation programming language (1GL) is a machine-level programming language.
A first generation (programming) language (1GL) is a grouping of programming languages that are machine-level languages used to program first-generation computers. Originally, no translator was used to compile or assemble the first-generation language. The first-generation programming instructions were entered through the front panel switches of the computer system, which was very expensive. There was originally no compiler or assembler to process the instructions in 1GL.
The instructions in 1GL are made of binary numbers, represented by 1s and 0s. This makes the language suitable for the machine to understand but far more difficult for a human programmer to interpret and learn. Machine language is the only language that is directly understood by the computer. It does not need any translator program. We also call it machine code, and it is written as strings of 1's (one) and 0’…

Generation of Computer Languages

1. The first generation languages, or 1GL, are low-level languages that are machine language.
2. The second-generation languages, or 2GL, are also low-level assembly languages.
3. The third-generation languages, or 3GL, are high-level languages such as C.
4. The fourth-generation languages, or 4GL, are languages that consist of statements similar to statements in a human language. Fourth generation languages are commonly used in database programming and scripting, and often contain visual tools to help develop a program (the sketch below contrasts the four levels).
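
To see the progression in abstraction at a glance, here is the same small step (add two numbers) sketched at each level in a short Python snippet; the binary and mnemonic encodings shown for 1GL and 2GL are invented purely for contrast.

```python
# The same "add 10 and 7" step viewed at different language generations.
# The 1GL and 2GL encodings below are invented for illustration only.

first_gen  = "00000001 00001010 00000010 00000111"  # raw binary machine code (1GL)
second_gen = "LOAD 10\nADD 7"                        # assembly mnemonics (2GL)

# 3GL: a single high-level, machine-independent statement.
result = 10 + 7

# 4GL: declarative, stating the desired result rather than the steps,
# e.g. the query a database language would accept.
fourth_gen = "SELECT 10 + 7"

print(result)   # 17
```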

Types of Computer

Computers can be broadly classified by their speed and computing power.
1. PC (Personal Computer): It is a single-user computer system having a moderately powerful microprocessor.
2. Workstation: It is also a single-user computer system, similar to a personal computer but with a more powerful microprocessor.
3. Minicomputer: It is a multi-user computer system capable of supporting hundreds of users simultaneously.
4. Mainframe: It is a multi-user computer system capable of supporting hundreds of users simultaneously; its software technology is different from that of the minicomputer.
5. Supercomputer: It is an extremely fast computer which can execute hundreds of millions of instructions per second.

PC (Personal Computer): A PC can be defined as a small, relatively inexpensive computer designed for an individual user. PCs are based on the microprocessor technology that enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing…

Computer - Fifth Generation

The period of the fifth generation is 1980 till date. In the fifth generation, the VLSI technology became ULSI (Ultra Large Scale Integration) technology, resulting in the production of microprocessor chips having ten million electronic components. This generation is based on parallel processing hardware and AI (Artificial Intelligence) software. AI is an emerging branch of computer science concerned with the means and methods of making computers think like human beings. All the high-level languages like C, C++, Java, .Net, etc. are used in this generation.
AI includes:
Robotics
Neural Networks
Game Playing
Development of expert systems to make decisions in real-life situations
Natural language understanding and generation

The main features of fifth generation are:
ULSI technology
Development of true artificial intelligence
Development of Natural language processing
Advancement in Parallel Processing
Advancement in Superconductor technology
More user friendly …

Computer - Fourth Generation

The period of the fourth generation was 1971-1980. The computers of the fourth generation used Very Large Scale Integrated (VLSI) circuits. VLSI circuits, having about 5000 transistors and other circuit elements with their associated circuits on a single chip, made it possible to have microcomputers of the fourth generation. Fourth generation computers became more powerful, compact, reliable, and affordable. As a result, they gave rise to the personal computer (PC) revolution. In this generation, time-sharing, real-time, network, and distributed operating systems were used. All the high-level languages like C, C++, DBASE, etc. were used in this generation.
The main features of fourth generation are:
VLSI technology used
Very cheap
Portable and reliable
Use of PC's
Very small size
Pipeline processing
No A.C. needed
Concept of internet was introduced
Great developments in the fields of networks
Computers became easily available

Some computers of this generation were:
DEC 10
STAR …

Computer - Third Generation

The period of the third generation was 1965-1971. The computers of the third generation used integrated circuits (ICs) in place of transistors. A single IC has many transistors, resistors and capacitors along with the associated circuitry. The IC was invented by Jack Kilby. This development made computers smaller in size, reliable and efficient. In this generation, remote processing, time-sharing and multi-programming operating systems were used. High-level languages (FORTRAN-II to IV, COBOL, PASCAL, PL/1, BASIC, ALGOL-68, etc.) were used during this generation.
The main features of third generation are:
IC used
More reliable in comparison to previous two generations
Smaller size
Generated less heat
Faster
Lesser maintenance
Still costly
A.C. needed
Consumed lesser electricity
Supported high-level language

Some computers of this generation were:
IBM-360 series
Honeywell-6000 series
PDP (Programmed Data Processor)
IBM-370/168
TDC-316

Computer - Second Generation

The period of the second generation was 1959-1965. In this generation transistors were used; they were cheaper, consumed less power, and were more compact in size, more reliable and faster than the first generation machines made of vacuum tubes. In this generation, magnetic cores were used as primary memory, and magnetic tape and magnetic disks as secondary storage devices. In this generation, assembly language and high-level programming languages like FORTRAN and COBOL were used. The computers used batch processing and multiprogramming operating systems.
The main features of second generation are:
Use of transistors
Reliable in comparison to first generation computers
Smaller size as compared to first generation computers
Generated less heat as compared to first generation computers
Consumed less electricity as compared to first generation computers
Faster than first generation computers
Still very costly
A.C. needed
Supported machine and assembly languages

Some computers o…

Generations of Computers

The history of computer development is often discussed in reference to the different generations of computing devices. Each generation of computers is characterized by a major technological development (switching technology) that fundamentally changed the way computers operate. Most major developments resulted in increasingly smaller, cheaper, more powerful and more efficient computing devices. The evolution of computers is categorized into five generations:
1st Generation (Vacuum Tube)
2nd Generation (Transistor)
3rd Generation (Integrated Circuit)
4th Generation (VLSI)
5th Generation (ULSI, Parallel Computing, Artificial Intelligence)

Computer - First Generation
The period of first generation was 1946-1959. The computers of first generation used vacuum tubes as the basic components for memory and circuitry for CPU (Central Processing Unit). These tubes, like electric bulbs, produced a lot of heat and were prone to frequent fusing of the installations, therefore, were very expensive and c…

CRT (CATHODE RAY TUBE)

CRT stands for Cathode Ray Tube. It is a very old style of display which uses a fluorescent-coated tube and projects electrons onto the screen; these projections are responsible for creating the images on the screen. These monitors are pretty heavy and have been made in various sizes. In a CRT there is a vacuum glass tube in which an electron gun shoots a beam of electrons toward the back of the monitor screen.
It is coated with chemical dots called phosphors that glow when electrons strike them.
The beam of electrons scans the monitor from left to right, and top to bottom, in a raster pattern to create the image.
A trio of dot phosphors is grouped in a triangle for each hardware picture element.
The electron beam returns regularly to each phosphor to sustain the glow.
More dots, better quality.
Dot pitch: the measurement between the same spot in two vertically adjacent dot trios.
Expressed in millimeters or dots per inch (see the conversion sketch after this list).
Dot pitch tells "sharpness".
Software-pixel placement is limited to hard…
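
Since dot pitch can be quoted either in millimeters or in dots per inch, the two figures are easy to convert (1 inch = 25.4 mm). A quick sketch in Python; the example pitch values are arbitrary.

```python
# Convert between dot pitch in millimeters and dots per inch (dpi).
# 1 inch = 25.4 mm, so dpi = 25.4 / pitch_mm and pitch_mm = 25.4 / dpi.

MM_PER_INCH = 25.4

def pitch_mm_to_dpi(pitch_mm):
    """Dots per inch for a given dot pitch in millimeters."""
    return MM_PER_INCH / pitch_mm

def dpi_to_pitch_mm(dpi):
    """Dot pitch in millimeters for a given dots-per-inch figure."""
    return MM_PER_INCH / dpi

print(round(pitch_mm_to_dpi(0.28), 1))   # a 0.28 mm pitch is about 90.7 dpi
print(round(dpi_to_pitch_mm(96), 3))     # 96 dpi corresponds to about 0.265 mm
```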

Display Devices

A display device or VDU (Video Display Unit) is an output device for the presentation of information in visual form. When the input information is supplied as an electrical signal, the display is called an electronic display.
Common applications for electronic visual displays are televisions or computer monitors.
All of the PCs that we use need to have a display. Normally there are the standard monitors, but they are now available in varieties like LCD and LED. The evolution of displays has not only reduced the space they occupy but has also made them more efficient.

Common Terms

Pixel: In digital imaging, a pixel, pel, dot, or picture element is a physical point in a raster image, or the
smallest addressable element in an all points addressable display device; so it is the smallest controllable element of a picture represented on the screen. The address of a pixel corresponds to its physical coordinates. LCD pixels are manufactured in a two-dimensiona…
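
The statement that a pixel's address corresponds to its physical coordinates can be made concrete: in a simple raster stored row by row, the pixel at column x and row y of a width-pixel-wide image lives at index y * width + x in the flat buffer. A small Python sketch; the frame size and the set_pixel helper are made up for illustration.

```python
# Addressing pixels in a raster image stored row by row (row-major order).
# The pixel at column x, row y sits at index y * WIDTH + x in the buffer.
# The frame size here is arbitrary.

WIDTH, HEIGHT = 8, 4
framebuffer = [0] * (WIDTH * HEIGHT)      # one value per pixel, all "off"

def set_pixel(x, y, value):
    """Set the pixel at physical coordinates (x, y) to the given value."""
    framebuffer[y * WIDTH + x] = value

set_pixel(3, 2, 1)                        # light a single pixel

# Print the raster: each slice of the buffer is one row on the screen.
for y in range(HEIGHT):
    row = framebuffer[y * WIDTH:(y + 1) * WIDTH]
    print("".join("#" if v else "." for v in row))
```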

Laser Printer

Laser printers are non-impact printers that work on the principle of static electricity, i.e. atoms with opposite charges attract each other. The computer or digital camera sends the matter to be printed to the printer. This information is converted into dots by an internal processor. The most important part of the printer is the photoreceptor, a revolving drum which is made of photoconductive material. This drum is given a positive charge. A laser beam is shot at it wherever there is a dot, making that area negatively charged. The laser beam remains off where there is blank space. The drum is then exposed to laser toner, which is positively charged. The positively charged toner pigments get attracted to the negatively charged areas of the drum. With one complete rotation, the drum is now covered with the required image. Next the print media, say a paper, which is negatively charged, is passed over the drum. The positively charged toner pigments now get attracted to the paper and th…