Tuesday, December 30, 2014


 Telegraphy is the long-distance transmission of messages using a code known to both sender and receiver; the device used for this is called a telegraph. Early examples include drumbeats and smoke signals, used to exchange information between far-flung points in ancient civilizations such as those of China, Egypt, and Greece.

However, such methods were limited by the weather. The semaphore, developed in the early 1790s, was a more recent precursor to the electric telegraph. A semaphore system consisted of a series of hilltop stations, each with large movable arms to signal letters and numbers and two telescopes to see the neighboring stations. Like ancient smoke signals, the semaphore was susceptible to weather and anything else that hindered visibility.

Before the invention of the telegraph, politics and business were constrained by geography. Knowledge of national or international news was limited, and what news did circulate was often badly dated. After the telegraph, the world changed; information seemed to flow as freely as air. During the middle and late 19th century, the electrical telegraph, whose messages were known as telegrams, was a highly popular form of communication. But who invented the telegraph system? Let us take a look.

Samuel Morse is widely credited as the inventor of the telegraph, and he helped develop the Morse code system. In addition to being an inventor, he was a critically acclaimed artist. He was born on April 27, 1791, in Charlestown, Massachusetts, the son of the pastor Jedidiah Morse. Pastor Morse held strong Calvinist views and sent his son to Yale to receive religious training. While studying at Yale, Samuel Morse attended lectures on electricity, but he remained an accomplished artist and supported himself with his paintings.

After graduating, Samuel Morse continued his career as an artist, and in 1825 New York City commissioned him to paint a portrait of Gilbert du Motier, the Marquis de Lafayette. The painting was left incomplete: days into the work, Morse received a message, delivered by horseback, that his wife was gravely ill. By the time he reached home, she had already passed away. It was then that Morse resolved to find a faster and more efficient method of communication than those available.

Two forms of the telegraph use electricity: the electrical telegraph and the electromagnetic telegraph. An electrical telegraph sends messages as electric pulses over a wire or by radio; an electromagnetic telegraph uses an electromagnetic device to transmit signals and data from one station to another.

To transmit messages across telegraph wires, Morse and Vail created Morse code in the 1830s. The code assigned each letter and number a set of dots (short marks) and dashes (long marks) based on frequency of use; letters used often (such as "E") got a simple code, while those used infrequently (such as "Q") got a longer, more complex code. Initially, when a message was transmitted over the telegraph system, the code was rendered as marks on a strip of paper that the operator would then translate back into English.
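The frequency-weighted design described above is easy to see in code. The following sketch (not from the original article; a minimal Python illustration using the standard International Morse letter codes) shows how common letters map to short codes:

```python
# International Morse code for the letters A-Z.
# Note the frequency-based design: common letters like E and T
# get the shortest codes, while rare letters like Q get long ones.
MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def encode(text):
    """Encode text to Morse: letters separated by spaces, words by ' / '."""
    words = text.upper().split()
    return ' / '.join(
        ' '.join(MORSE[c] for c in word if c in MORSE)
        for word in words
    )

print(encode("What hath God wrought"))
```

Note that "E", the most frequent English letter, encodes to a single dot, while the rarely used "Q" takes four elements.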

In 1832 Samuel Morse, assisted by Alfred Vail, conceived the idea for an electromechanical telegraph, which he called the "Recording Telegraph." This commercial application of electricity was made tangible by their construction of a crude working model in 1835-36. The instrument was probably never used outside Professor Morse's rooms, but it featured in a number of demonstrations.

Morse, Vail, and a colleague, Leonard Gale, refined the telegraph into working mechanical form in 1837, and that year Morse filed a patent for it at the U.S. Patent Office. Electricity, supplied by Joseph Henry's intensity batteries, was sent over a wire. Holding down the device's key interrupted the flow of electricity through the wire for shorter or longer periods; the resulting dots and dashes were recorded on a printer or could be interpreted by ear. In 1838, Morse perfected his sending and receiving code and organized a corporation, making Vail and Gale his partners.

In 1843, Morse built a telegraph line from Washington, D.C., to Baltimore with the financial support of Congress. On May 24, 1844, the first message, "What hath God wrought?" was sent over the line. Prior to the telegraph, it took days, weeks, or even months for messages to travel between far-flung locations. Around the 1850s came an important advance: operators realized that the clicks of the recording instrument formed a sound pattern they could read directly as dots and dashes. This allowed an operator to take a message by ear, writing it down as it arrived, and transformed the telegraph into a faster, more versatile system.

After telegraph cables were stretched from coast to coast and across the Atlantic, a message from London to New York could be sent in a few minutes, and the world suddenly became much smaller through telegraphic communication. By the 1850s, predictions about the telegraph's impact were a common topic of discussion: it would alter business and politics, make the world smaller, erase national rivalries, contribute to the establishment of world peace, and make newspapers obsolete.

Rather than taking weeks to be delivered by horse-and-carriage mail carts, pieces of news could be exchanged between telegraph stations almost instantly.

Later, many other inventors advanced the telegraphic system. Thomas Edison's Quadruplex allowed four messages to be sent over the same wire simultaneously, two in each direction. In 1883, Wheatstone's Automatic Telegraph, an English automatic signaling arrangement, allowed a much larger number of words to be transmitted over a wire at once, though it was only advantageous on circuits carrying a heavy volume of traffic.

Buckingham's Machine Telegraph, an improvement on the House system, printed received messages quickly and legibly in plain Roman letters on a message blank, ready for delivery. The Vibroplex, a semi-automatic key introduced around 1890 and sometimes called a "bug key," made the dots automatically, relieving the operator of much physical strain.

People were eager for a faster, easier way of sending and receiving information, so they quickly accepted this system of long-distance communication. Widespread, successful use of the device, however, required a unified network of telegraph stations among which information could be transmitted.

The Western Union Telegraph Company, founded in part by Ezra Cornell, was at first only one of many such companies that developed around the new medium during the 1850s. By 1861, however, Western Union had laid the first transcontinental telegraph line, making it the first nationwide telegraph company. In 1866, the first permanent telegraph cable was successfully laid across the Atlantic Ocean; by 1940 there were 40 such lines across the Atlantic.

New technologies began to emerge by the end of the 19th century, many of them based on principles first developed for the telegraph. In time, these new technologies pushed the telegraph out of regular widespread use. Although the telegraph has been replaced by the even more convenient telephone, fax machine, and Internet, its invention stands as a turning point in world history: it laid the groundwork for the communications revolution that led to those later innovations.

Saturday, December 20, 2014


 Since the invention of writing, people had been searching for something easier to write on than papyrus or parchment, and something easier and cheaper to make. It took roughly 3,000 years to arrive at the invention of paper.
Paper is believed to have been invented around 100 BC in China. Ancient paper fragments from the Xuanquanzhi ruins of Dunhuang, in China's northwestern Gansu province, apparently date from the reign of Emperor Wu, who ruled between 140 BC and 86 BC.

But it is Ts'ai Lun who is credited as the actual inventor of paper. His role in developing a practical papermaking material earned him a place of honor in Chinese history, for it revolutionized his country. In 105 AD, under the Han Dynasty emperor Ho-Ti, Ts'ai Lun, a government official, was the first to start a paper-making industry.

 To make paper, Ts'ai Lun mixed finely chopped mulberry bark and hemp rags with water, mashed the pulp flat, then pressed out the water and let the sheet dry in the sun. He may have based his idea on bark cloth, which was very common in China and also made from mulberry bark. Ts'ai Lun's paper was a big success and began to be used all over China, later spreading to the rest of the world along the Silk Road.

Within a few years, the Chinese were using paper widely for writing. By 740 AD, the first printed newspaper appeared in China.

The real advancement in papermaking came with the development of a smooth material for the mold covering, which made it possible for the papermaker to free the newly formed sheet and reuse the mold immediately. This covering was made from thin strips of rounded bamboo stitched or laced together with silk, flax, or animal hairs.

Papermaking spread to Korea as early as the 6th century AD. Pulp for paper was prepared from the fibers of hemp, rattan, mulberry, bamboo, rice straw, and seaweed. According to tradition, a Korean monk named Don-cho brought papermaking to Japan by sharing his knowledge at the Imperial Palace in approximately 610 AD.

The technique, taught by Chinese papermakers, eventually reached Tibet around 650 AD, and Tibetans began making their own paper as a replacement for their traditional writing materials. The shape of Tibetan paper books still reflects the long, narrow format of the original palm-leaf books. Chinese papermakers also carried their craft into Central Asia and Persia.

Paper was later introduced into India by traders after 645 AD. When Hsuan Tsang arrived in India in 671 AD, paper was already in popular use there.

A little more than 500 years later, the Arabs learned papermaking from Chinese prisoners captured in a major battle near Samarkand in 751 AD, and they built the first paper mill in Baghdad in 793 AD. Paper then came into use throughout the Abbasid Caliphate.

The Egyptians learned papermaking from the Arabs during the early 10th century, and by around 1100 AD paper had arrived in Northern Africa.
By 1150 AD it reached Spain as a result of the Crusades, and the first paper mill in Europe was established there.
In 1453 AD Johann Gutenberg invented the printing press. The first paper mill in North America was built in Philadelphia in 1690.

Wednesday, December 17, 2014


Nowadays we often use hair dryers for hair styling and hair care. A hair dryer is a portable, hand-held hot-air tool that is very easy to use; it contains a blower of some kind that dries the hair. Dryers used in the past, however, were large fixed structures. Let us take a quick look at the invention and its evolution.

Before the invention of the hair dryer, a common method of drying hair was to use a vacuum cleaner.

In 1890, a Frenchman named Alexander Godefroy, taking inspiration from the vacuum cleaner, invented a hair dryer for use in his salon. It was a sit-down dryer with a large hood, similar to the ones seen in hair salons today: a hard dome comes down and fits over the person's head, and hot air is blown out through tiny openings around the inside of the dome so the hair dries evenly.

In 1911, the Armenian-American inventor Gabriel Kazanjian received the first patent for a hand-held hair dryer.

He was not the only person working on hair-drying technology. From the 1920s there was a rush of inventors working to improve the hair dryer. Many of them worked with the proven salon hood dryer: patents were awarded for designs adding a magazine stand/ashtray combo, and for putting speakers into the dryer so clients could listen to music while they waited for their hair to dry.

Innovations by National Stamping and Electric Works, under the White Cross brand, and later by the U.S. Racine Universal Motor Company and the Hamilton Beach Co., allowed the dryer to become handheld; by 1915, handheld dryers were on the market.

Even in the 1920s, the new dryers were heavy, weighing approximately 2 pounds (0.91 kg), and difficult to use, with many cases of overheating and electrocution. They also drew only about 100 watts, so drying hair took a long time.

There were two other major types of blow dryer besides the handheld: the bonnet dryer and the rigid-hood dryer. The bonnet dryer went on sale in 1951. The dryer itself, usually housed in a small portable box, was connected by a tube to a bonnet with holes in it that was placed over the person's head, delivering an equal amount of heat to the whole head at once.

The rigid-hood hair dryer, most frequently seen in salons, has a hard plastic helmet that fits over the head. It works like the bonnet dryer but at a much higher wattage.

An important change to the hand held hair dryer occurred in 1954 when GEC changed the design of the dryer to move the motor inside the casing.

Another important change came in the 1960s, addressing the weight of the blow dryer: better electric motors were introduced, along with lighter plastics. In the 1970s the Consumer Product Safety Commission (CPSC) set up hair-dryer safety guidelines that addressed the past problems of electrocution and overheating.

By the 1970s, hair dryers were safe and powerful enough to be successfully mass marketed, and hairstylists were using them too, inventing new hairstyles to take advantage of the "new" technology.

Since 1991, the CPSC has mandated by US law that all dryers use a ground fault circuit interrupter, so that a dryer cannot electrocute a person if it gets wet. By 2000, deaths from blow dryers had dropped to fewer than four people a year, a drastic change from the hundreds of electrocution cases that occurred in the past.

The average dryer today can use up to 2,000 watts, so drying hair generally takes only a few minutes, there is no danger of electric shock, and the dryers are light in weight. All in all, it is an undeniably useful tool.

Monday, December 15, 2014


Earlier, people cleaned their clothes by pounding them on hard rocks or scrubbing them with abrasive sand, washing the dirt away in rivers or lakes. Later, washboards were developed to help streamline the process. Bleaching was practiced even in Roman times: to get the whitest whites, clothing was taken to a "fuller," who put it in a tub of fermented urine and agitated it with his feet. For centuries, people on sea voyages washed their clothes by placing the dirty laundry in a strong cloth bag, tossing it overboard, and letting the ship drag the bag for hours; this forced water through the clothes to remove dirt.
But washing clothes this way took a lot of time and effort, and there was an obvious need for technology that could get clothing clean more quickly.
 The first washing machine, the scrub board, was invented in 1797. Agitation of the clothing was necessary to get it clean, so washing machines remained hand operated while electricity was not available to all.
In 1851, the American James King patented the first washing machine to use a drum; however, it was still hand powered and did not work by itself.
Then in 1858, William Blackstone of Indiana built a washing machine for his wife. It removed and washed away dirt from clothes and was the first washing machine intended for use in the home.
Early washing machines were made from wood, but later models were made from metal; a fire could be built under a metal machine to keep the water conveniently hot for each load. Rollers and wringers made the wringing process quicker and easier, but washing still took a very long time.

Electricity was not commonly available in homes until around 1930. Washing machines that were automatic rather than hand-cranked often ran on a single-cylinder, gasoline-powered engine.
In 1908, the Hurley Machine Company of Chicago introduced the Thor, the first electric-powered washing machine, with a galvanized tub and an electric motor. Earlier machines had a heavy cast-iron mechanism mounted on the tub lid, which made them heavy and bulky.
By 1920 the production of coppered wooden tubs had stopped completely. Because the bulky metal tub made the machine noisy, tubs were now ribbed copper, plated with nickel or nickel-chromium; another modification was covering the tubs with white enamelled sheet metal to make cleaning easier.
From a technological perspective, washing machine manufacturers faced a number of challenges. These included discovering a method of transferring power from the motor to the mechanism, finding a suitable motor with sufficient initial starting torque, and ensuring that the operator did not get an electrical shock during operation.
To transfer power, some washing machines were chain driven, some belt driven, and others used shafts and gears.

To overcome the initial resistance in starting a washing machine, a fractional-horsepower motor that would not burn out or overheat during start-up was used, typically a 1/8 or 1/4 horsepower motor manufactured out-of-house by Westinghouse or Robbins and Myers.
To prevent electric shocks, the stator and rotor of the machine were enclosed in a housing equipped with a fan to prevent overheating. From the customer satisfaction perspective, a machine that would wash without shredding the clothes needed to be developed. This meant that if the original scrubbing machines were used, the machine had to be operated at different speeds for different textiles. To overcome the problem, washing machines that sloshed water through the clothing by agitation were developed. Either the tub moved or a baffle placed inside the tub moved.

In 1937, Bendix received the patent for and introduced the first fully automatic washing machine.
In 1947, General Electric introduced the first top loading automatic washing machine. This machine had many of the same features found in modern machines. It was equipped with 5 push buttons to control wash temperature, rinse temperature, agitation speed and spin speed.
In the 1950s the mechanical timer was developed and simplified clothes washing for the homemaker. Washing machines did not change much for the next forty years.
In the 1990s James Dyson created a washing machine that had two cylinders rotating in opposite directions. This was said to make the clothes cleaner and shorten wash time.
In 1994 Staber Industries introduced the System 2000 washing machine. It is a top loading, horizontal axis washer. Because the tub spins like a front loading machine it uses only about one third of the water normally used by a washing machine.
In 2006 Sanyo introduced a washing machine that uses only 50 L of water because it recycles the water used.
In 2008, the University of Leeds created a prototype machine that used only one cup of water per load of laundry. This would save huge amounts of water every year.

Sunday, December 14, 2014


We all wear clothes, either ready-made or stitched for us. To make a dress from factory-manufactured cloth, cutting and stitching are required. The sewing machine was a great invention for the ease of stitching: a dress that once took days to make by hand now takes just a few hours. It reduced labour costs and made production far more efficient. Let us look at the history of the sewing machine's invention.
Before 1755, there is no mention in history of any invention related to sewing. In 1755, in London, a German immigrant, Charles Weisenthal, took out a patent for a needle designed for mechanical sewing. For another 34 years there was no invention in this field. The Englishman Thomas Saint's invention is generally considered the first real sewing machine.
In 1790 the cabinet maker patented a machine with an awl that made a hole in leather and then allowed a needle to pass through to stitch it. Evidence suggests that Saint only patented an idea and that the machine was never built; when an attempt was later made to build a machine from Saint's patent drawings, it would not work without considerable modification.
Now the story  moves to Germany where, in around 1810, inventor Balthasar Krems developed a machine for sewing caps.
An Austrian tailor, Josef Madersperger, produced a series of machines during the early years of the 19th century and received a patent in 1814. Two more inventions were patented in 1804: one in France to Thomas Stone and James Henderson, for a machine that attempted to emulate hand sewing, and the second to a Scot, John Duncan, for an embroidery machine using a number of needles.

In 1818, a Vermont churchman, John Adams Doge, and his partner John Knowles invented a device that could sew a very short length of material, something not possible earlier.
The real inventor of the sewing machine is arguably Barthelemy Thimonnier, who in 1830 was granted a patent by the French government. His machine, built almost entirely of wood, used a barbed needle; it is said he originally designed it for embroidery but then saw its potential as a sewing machine.
He convinced the authorities of the usefulness of his invention and was eventually given a contract to build a batch of machines and use them to sew uniforms for the French army.
Thimonnier established a factory running 80 machines, but then ran into trouble with Parisian tailors, who feared that if his machines succeeded they would be left without work or wages for hand stitching.

Late one night a group of tailors stormed the factory and destroyed every machine. With a new partner, Thimonnier started again, produced a vastly improved machine, and was set to go into full-scale production, but the tailors attacked again. Receiving little help from the police or army in France, he fled to England with the one machine he was able to salvage.
He certainly produced the first practical sewing machine, was the first to offer machines for sale on a commercial basis, and ran the first garment factory. For all that, he died in the poorhouse in 1857.
In 1833, in America, the Quaker Walter Hunt invented the first machine to make a lock stitch, using two spools of thread and an eye-pointed needle like those used today. But it, too, was unsuccessful.

Nine years later, Hunt's countryman John Greenough invented a working machine in which the needle passed completely through the cloth.
In early 1844, the Englishman John Fisher invented a machine which, although designed for the production of lace, was essentially a working sewing machine. Probably because of mis-filing at the patent office, this invention was overlooked during the long legal arguments between Singer and Howe over the origins of the sewing machine.

Elias Howe, an American who patented his own lockstitch machine in 1846, found little interest at home. Desperately in debt, Howe sent his brother Amasa to England with the machine, hoping it would receive more interest on the other side of the Atlantic. Amasa could find only one backer, a corset maker named William Thomas, who eventually bought the rights to the invention and arranged for Elias to come to London to develop the machine further.
The two accused each other of failing to honour their agreements, and eventually Elias returned to America. When he arrived home he found that the sewing machine had finally caught on and that dozens of manufacturers, including Singer, were busy making machines.

Sunday, December 7, 2014


For writing on paper, we have all used pens in our daily routine. A variety of pens are available in the market, and we choose among them to produce beautiful handwriting. But have we ever thought about the mechanism inside the pen, its history, and how its evolution gave us such a beautiful creation? Let's have a look.

Around 4000 BC, moist clay tablets were used for writing, scratched with tools made of bronze or bone.

Around 3000 BC, Egyptians painted hieroglyphics, a form of picture writing, with brushes made from marsh reeds. Scribes used thin reed brushes or reed pens for writing on papyrus scrolls. The Sumerians used wedge-shaped reed styluses to press pictorial shapes into clay tablets, and the ancient Chinese wrote with brushes of stiff hair. Ancient Greeks and Romans sharpened stiff reeds to a point, producing writing that was taut and precise.

The Romans developed a new form of writing: they scribed into thin sheets of wax on wooden tablets using a metal stylus, and erased by rubbing with the flat end of the stylus. In Asia, scribes used a bronze stylus.

During 600-1800 AD, writing on parchment with a quill pen altered the style of European writing. Pens were made from the wing feathers of birds such as geese and swans: the shaft of the feather was hardened, and the writing tip was shaped and slit to make writing easy. These feather pens, known as quill pens, were favoured by writers for over 1,000 years. The soft quill was honed to a point, split at the tip to permit ink to flow freely, and constantly resharpened.

Between 1800 and 1850, the steel nib displaced the quill. A metal pen point had been patented in 1803, but the patent was not commercially exploited. Joseph Gillott's steel pen nib, which came into common use in the 1830s, required no sharpening and could be separated from the pen body and changed as needed. By 1850 quill-pen usage was fading, and the quality of steel nibs had been improved by tipping them with hard alloys of iridium, rhodium, and osmium. Still, the writer constantly dipped the pen into ink, hoping to avoid drips.

Lewis Edson Waterman, an insurance broker, invented the first proper fountain pen in 1884.

 Fountain pens store ink in a reservoir inside the pen, keeping the nib supplied with a constant stream of ink. Alonzo Cross had offered a "stylographic pen" with an ink-depositing needle point in the late 1860s, but blots and smears were still common.
The first patent for a ballpoint pen was issued on October 30, 1888, to a man named John J. Loud. Loud's invention featured a reservoir of ink and a roller ball that applied the thick ink to leather hides. While the invention worked on leather, it was not well suited to paper because of the ink: if the ink was thin the pen leaked, and if it was too thick it clogged. Depending on the temperature, the pen would sometimes do both.

The first man to actually develop and launch a practical ballpoint pen was the Hungarian Laszlo Jozsef Biro of Budapest, who patented his design in 1938. He is considered the inventor of today's ballpoint pen.

While working as a journalist, he was frustrated by the time wasted filling fountain pens and cleaning up ink smudges. Besides that, the sharp tip of his fountain pen often scratched or tore through the newsprint.

Biro noticed that the ink used in newspaper printing dried quickly, leaving the paper dry and smudge-free, and he got the idea to use the same type of ink in a writing instrument. Since the thicker ink would not flow from a regular pen nib, he fitted his pen with a tiny ball bearing in its tip. As the pen moves along the paper, the ball rotates, picking up ink from the cartridge and leaving it on the paper.

At the very end of 1938, just one day before anti-Jewish laws became active in Hungary, Biro fled to Paris. When World War II broke out in Europe a few years later, the Biro brothers fled on to Argentina, stopping in Paris along the way to patent their pen. Augustine Justo, then the president of Argentina, had urged them to set up a factory there and gave Biro his signed card, which helped Biro obtain a hard-to-get visa for the South American country.

In 1943 Biro obtained a new patent in Argentina, set up a manufacturing plant, and became the country's leading producer of ballpoint pens.
The Biro pen depended on gravity to feed ink to the roller ball. This meant the pen worked only when held more or less upright, and even then the ink flow was sometimes too heavy, leaving smudgy globs on the paper.
The Biro brothers returned to their laboratory and produced a new design that relied on capillary action rather than gravity to feed the ink. The rough "ball" at the tip acted like a metal sponge; with this improvement, ink flowed more smoothly to the ball, and the pen could be held at a slant rather than straight up.

The British government bought the patent because the pen's functioning was not affected by air pressure at high altitude, making it useful to navigators in airplanes. In 1944, a pen under the brand name "Biro" was produced.

In an attempt to corner the market, the Eberhard Faber Company paid the Biro brothers $500,000 for the rights to manufacture their ballpoint pen in the United States. Eberhard Faber later sold its rights to the Eversharp Company, but neither was quick to put a ballpoint pen on the market; there were still too many bugs in the Biro design.

Shortly afterwards, in 1945, the Chicago businessman Milton Reynolds brought some of Biro's pens from Argentina to the US. It was Reynolds who made the deal with Gimbels, the first retail store in America to sell ballpoint pens. He set up a makeshift factory with 300 workers who began stamping out pens. In the months that followed, Reynolds made millions of pens and became fairly wealthy, as did many other manufacturers who decided to cash in on the new interest.

In September 1945, Julian Levy, Milton Reynolds' son-in-law, asked Paul C. Fisher to help improve their as-yet-unlaunched pen. After two days of testing, Fisher declined the offer, concluding that "the basic principle is not sound."

In 1953 Fisher invented the "Universal Refill," which could be used in most pens. It sold well, since store owners could reduce their stock of assorted refills, and Fisher continued to improve it.
Also in 1953, the French baron Marcel Bich developed an industrial process for manufacturing ballpoint pens that lowered the unit cost dramatically, the basis of the BIC company.

In 1954, Parker Pens introduced its first ballpoint pen, the Jotter, which became a success.
In the 1960s, Papermate's Flair, with its retractable tip and no-smear ink, was among the first felt-tip pens to hit the U.S. market, and it has been the leader ever since. Following their initial success with felt-tips, manufacturers branched out with a variety of fiber-tipped instruments, including the newly popular highlighters.

In 1966, Fisher found an elegant solution using thixotropic ink, which remains semisolid until the shearing action of the rolling ball liquefies it, so the ink flows only when needed. The cartridge is pressurized with nitrogen so that it does not rely on gravity to work: the pen writes in freezing cold, desert heat, underwater and upside down.
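The "liquefies under shear" behavior can be sketched with a simple power-law (shear-thinning) fluid model. This is only an illustration of the general idea; the constants `K` and `n` are made-up values, not the actual parameters of Fisher's ink.

```python
# Sketch of shear-thinning ink behavior using a power-law fluid model:
# apparent viscosity = K * shear_rate**(n - 1). With n < 1, viscosity
# drops as the shear rate rises. K and n are illustrative, not real data.

def apparent_viscosity(shear_rate, K=50.0, n=0.3):
    """Apparent viscosity (Pa*s) of a shear-thinning fluid."""
    return K * shear_rate ** (n - 1)

# Ink at rest (tiny shear) is thick; under the rolling ball (high shear) it thins.
at_rest = apparent_viscosity(0.01)       # nearly solid
under_ball = apparent_viscosity(1000.0)  # flows freely
print(at_rest > under_ball)  # True: shearing liquefies the ink
```

The same qualitative picture explains why the ink does not leak from the tip when the pen is idle but flows freely the moment the ball rolls.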

The 1980s and 1990s saw the introduction of the rollerball pen. Unlike the thick ink used in a conventional ballpoint, rollerball pens employ a mobile ball and liquid ink to produce a smoother line. Technological advances during the late 1980s and early 1990s greatly improved the rollerball's overall performance.

In the 1990s, manufacturers commonly added rubberized grips to writing instruments to make them easier to hold. In 1997, the GRANDEE Corporation began mass production of the Ring Pen, designed so that it can be used for writing without the usual three-finger grip.



Thursday, December 4, 2014


A vacuum cleaner (also called a vacuum, a hoover or a sweeper) is a device that uses an air pump to create a partial vacuum to suck up dust and dirt, usually from floors.

The first attempts at a mechanical solution to floor cleaning began in England in 1599. Before vacuum cleaners, rugs were hung over a wall or line and hit repeatedly with a carpet beater to knock out as much dirt as possible.

In the United States, the first hand-powered cleaner to use the vacuum principle, called the "Whirlwind", was invented in Chicago, Illinois in 1865 by Ives W. McGaffey. The Whirlwind was not motorized; a hand crank drove a fan to generate suction. The design was not very heavy, but it was hard to operate because the user had to turn the crank while pushing the machine over the rug or floor. McGaffey obtained a patent for the machine on June 5, 1869 and started the American Carpet Cleaning Co.

John Thurman invented a gasoline-powered vacuum cleaner in 1899, considered the first motorized vacuum cleaner. Thurman's machine was patented on October 3, 1899.

Another person who helped the vacuum cleaner evolve was Hubert Cecil Booth, a British engineer. He had observed a device in which a blast of air was used to blow the dust off the seats of railway carriages.

He decided to try the opposite effect and tested the idea by sucking with his mouth against the back of a plush seat in a restaurant. He realized that he would need a way to filter the air and trap the dust.

He received a British patent for his suction cleaner on 30 August 1901.

It was a large, horse-drawn, petrol-driven unit that was parked outside the building to be cleaned, with long hoses fed through the windows to reach the rooms. The vacuum cleaner was first demonstrated in a restaurant in 1901.

The machine was powered by an internal combustion engine and included no brushes. The invention was well received by British royalty, and this opened the door for other inventors to try to improve the vacuum cleaner.

Two Americans improved on Booth's design. Corinne Dufour invented a device that sucked dust into a wet sponge, and David E. Kenney invented a huge machine that was installed in the cellar and connected to a network of pipes leading to each room of the house.

Between 1903 and 1913, the New Jersey inventor David T. Kenney was granted nine patents for machines similar to Booth's "Puffing Billy"; he established the Suction Cleaner Company and, with it, the American vacuum cleaner industry.

Walter Griffiths developed an improved manual vacuum cleaner in Birmingham, England in 1905. The operator pumped a bellows-like contraption to suck up dirt through a flexible pipe; this was the first device that resembled a modern vacuum cleaner.

In 1907, James Murray Spangler, a janitor in a Canton, Ohio department store, found that the carpet sweeper he used was the source of his cough. He was allergic to dust but could not afford to leave his job.

He decided to find a solution for this and tried to improve the way he swept the carpet. He invented a portable electric vacuum cleaner.

It was made up of an old fan motor attached to a soap box stapled to a broom handle. He used a pillow case as a dust collector on the contraption. He then improved his basic model to use both a cloth filter bag and cleaning attachments.

Spangler patented his rotating-brush design in 1908 and formed the Electric Suction Sweeper Company. One of the first buyers was a cousin, Susan Hoover, the wife of a leather goods manufacturer, and she was really impressed with the cleaner. Her husband, William H. Hoover bought the rights from James and made him a partner in Hoover’s Electric Suction Sweeper Company.

William Hoover became president of the Hoover Company, with Spangler as superintendent, and invested money in manufacturing the vacuum cleaner. The early Hoover models resembled a bagpipe attached to a cake box, but they worked.

Sluggish sales were boosted by Hoover's ten-day free home trial, and eventually there was a Hoover® vacuum cleaner in nearly every home. The Hoover Company is still a leading manufacturer of vacuum cleaners in Great Britain, and it opened the door for other companies like Kirby, Oreck, and Dyson.

The first "portable" electric vacuum was invented in 1905 by Chapman and Skinner in San Francisco. It weighed 92 pounds and used a fan 18 inches in diameter to produce the suction. Because of its size, it did not sell well.

Modern vacuum cleaners are lightweight, small and compact, and ever more cleaning attachments are being added to make them all-in-one cleaning units. Let us see how one works. Turning the cleaner on starts a motor attached to a fan. As the angled fan blades rotate, air is pushed forward toward the exhaust port. Because air is forced ahead, the pressure behind the fan drops, creating suction inside the cleaner. The resulting airflow carries debris into the vacuum bag, which traps the dirt and dust. Most vacuum cleaners today also have brushes to loosen debris, which makes them more effective.
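The pressure-drop idea above can be put in rough numbers. This is a minimal sketch using a simplified orifice-flow formula; the pressure drop, intake area and discharge coefficient below are assumed illustrative values, not measurements of any real cleaner.

```python
import math

# Why a fan creates suction: the fan lowers the pressure inside the
# cleaner, and outside air rushes in through the intake. Flow is
# estimated with a simplified orifice equation, Q = Cd * A * sqrt(2*dp/rho).
# All numbers here are assumptions for illustration only.

def intake_airflow(pressure_drop_pa, intake_area_m2,
                   air_density=1.2, discharge_coeff=0.6):
    """Volumetric airflow (m^3/s) through the intake for a given pressure drop."""
    return discharge_coeff * intake_area_m2 * math.sqrt(
        2 * pressure_drop_pa / air_density)

# A modest 2 kPa drop across a 10 cm^2 intake:
q = intake_airflow(2000.0, 10e-4)
print(round(q, 3), "m^3/s")  # roughly 0.035 m^3/s of dust-laden air
```

The point of the sketch is simply that a bigger pressure difference or a bigger intake moves more air, which is why clogged filters (smaller effective pressure drop) make a vacuum cleaner noticeably weaker.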


The refrigerator is now an essential household appliance that we see in nearly every home. We all know it is used to preserve food, but have we ever wondered how people stored or preserved food centuries ago, before the refrigerator was invented, why its invention became necessary, and what stages its invention journey passed through? Let's have a look.

Food preservation was not always as easy as it is today with refrigerators. Bacteria are present in all food, and at room temperature they multiply rapidly, making food inedible and even dangerous. Centuries ago, most people relied on canning, pickling and salting their food to preserve it. Bacterial activity slows dramatically at cold temperatures and stops at the freezing point or below, so keeping food cold can maintain its freshness for days when it might otherwise last only hours; freezing can preserve it for months or years.

People used to gather ice from streams and ponds and store it underground year-round, to keep things chilly during the summer.

According to the Association of Home Appliance Manufacturers, the intermediate step between storing ice underground and modern refrigeration was the icebox: a wooden box lined with tin or zinc and insulated with sawdust, used to store ice while slowing its melting.

Thomas Jefferson maintained an ice house at his Monticello estate. Every winter, he brought in more than 60 wagonloads of ice from the nearby Rivanna River to keep it filled.
There are proofs indicating that the Egyptians, Chinese and Indians used ice in food preservation.
In 1626, Sir Francis Bacon was also testing the idea that cooling meat could be used to preserve it but his chilly experiment caused him to develop pneumonia, from which he died on Easter Day, April 9, 1626.

The first known artificial refrigeration method was demonstrated by William Cullen at the University of Glasgow in 1748. However, he did not use his discovery for any practical purpose.

Benjamin Franklin took an early step toward the invention of the refrigerator. In 1758, together with the chemist John Hadley, he experimented with the effect of evaporation on temperature. By repeatedly swabbing ether on a thermometer and allowing it to evaporate, the pair dropped the thermometer's reading to well below the freezing point.
In 1805, an American inventor, Oliver Evans, worked on the principles developed by Franklin and designed the first refrigeration machine.

In a refrigeration machine, a "refrigerant" gas is forced by compression to condense into a liquid and then allowed to evaporate back into a gas. As it evaporates, it draws heat from inside the machine and makes it cooler. The cycle repeats again and again, so the fridge stays constantly cold.
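The cycle described above moves heat "uphill" from cold to hot at the cost of work done by the compressor. The theoretical best case is captured by the Carnot coefficient of performance (COP), which this small sketch computes; the temperatures are illustrative values, not those of any particular machine.

```python
# Sketch of the vapor-compression idea in terms of the ideal (Carnot)
# coefficient of performance: heat moved per unit of compressor work.
# Temperatures are illustrative and given in kelvin.

def carnot_cop(t_cold_k, t_hot_k):
    """Ideal COP of a refrigerator moving heat from t_cold_k to t_hot_k."""
    if t_hot_k <= t_cold_k:
        raise ValueError("heat must be rejected to a warmer reservoir")
    return t_cold_k / (t_hot_k - t_cold_k)

# Fridge interior at 4 C (277 K), kitchen at 22 C (295 K):
cop = carnot_cop(277.0, 295.0)
print(round(cop, 1))  # ~15.4: ideally each joule of work moves ~15 J of heat
```

Real refrigerators fall well short of this ideal limit, but the formula shows why a fridge uses far less energy than the heat it moves, and why cooling against a larger temperature gap costs more work.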
In Philadelphia, Evans befriended a young inventor named Jacob Perkins. Perkins took Evans' design, modified it, and received a patent on his own design in 1834. He then persuaded a man named John Hague to construct the machine, and the refrigerator was born. It used ether in a vapor-compression cycle.

An American physician, John Gorrie, built a refrigerator based on Oliver Evans' design in 1844 to make ice to cool the air for his yellow fever patients.  He also pioneered air conditioning at the same time, since his idea was to blow air across the ice-making machine to cool hospital patients suffering from malaria in Florida.

In 1856 another American, Alexander Twining, began selling a fridge based on compressing gas.
In 1859, France’s Ferdinand Carre created a more advanced system that used ammonia as a coolant. The earlier vapor-compression machines used air. The ammonia worked well, but was toxic if it leaked.

Fred W. Wolf invented DOMELRE, an air-cooled refrigeration unit. It was the first commercially viable electric refrigerator in the United States. It was mounted on top of an icebox. 

In 1915, Alfred Mellowes designed an electric refrigeration unit that was self-contained; the compressor was in the bottom of the cabinet. Guardian Refrigerator Company started manufacturing and selling Mellowes’ version of the refrigerator in 1916.
W.C. Durant, who was president of General Motors, purchased the Guardian Refrigerator Company privately, and the business was renamed Frigidaire. The first Frigidaire refrigerator was completed in September 1918 in Detroit. 

Continued improvements in refrigerator production, along with organizational changes in the company, resulted in a better product at a lower price. Frigidaire added ice cream cabinets in 1923, soda fountain equipment in 1924, and water and milk coolers in 1927. By 1929, 1 million refrigerators had been produced, a marked improvement on the early years.

Still, it was not until around the end of World War I that fridges were mass-produced and cheap enough for ordinary consumers. The first combined fridge-freezer came along in the US around 20 years later, in 1939.

Refrigerators from the late 1800s until 1929 used the toxic gases ammonia (NH3), methyl chloride (CH3Cl), and sulfur dioxide (SO2) as refrigerants. Several fatal accidents occurred in the 1920s when methyl chloride leaked out of refrigerators.

Three American corporations launched collaborative research to develop a less dangerous method of refrigeration; their efforts led to the discovery of Freon.

In just a few years, compressor refrigerators using Freon became the standard for almost all home kitchens. But Freon is a chlorofluorocarbon (CFC), a class of compounds very harmful to the ozone layer in our atmosphere.

Research is ongoing into eco-friendly alternatives to Freon for refrigerator design. Hopefully we will see one soon.

Monday, December 1, 2014


We are all very familiar with the word 'television'. Nowadays it is popularly called the 'idiot box', and we can't imagine living without it. In our busy schedules it is a good source of entertainment for all age groups, and it also plays an important role in advertising and marketing.

But many of us are not familiar with the history of television: how it was invented and how it has evolved into today's LED TVs. Let's have a brief look.

After the invention of the telephone in 1878, the idea of transmitting moving images through electric signals was first sketched as the 'telephonoscope'. At the time, it was imagined that someday light could be transmitted over copper wires, just as sound was transmitted through the telephone.

In 1881, scanning was used to transmit images in practice in the pantelegraph, through a pendulum-based scanning mechanism. Since then, scanning in one form or another has been used to convert a visual image into electric pulses for transmission. This process is called "rasterization".

In 1883, the discovery that the electrical resistance of selenium varies with light created the technical possibility that pictures could be translated into electrical signals.

In 1884, Paul Gottlieb Nipkow, a 23-year-old student in Germany, developed the first mechanical module of television. For rasterization he used a spinning disk with a series of holes spiraling toward the center. The holes were spaced at equal angular intervals so that, in a single rotation, the disk would allow light to pass through each hole in turn onto a light-sensitive selenium sensor, which produced the electrical signals; each hole captured one horizontal slice of the entire image. This 'electric telescope' had 18 lines of resolution, but moving images were still not possible because of the poor sensitivity of the selenium sensors.
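The hole layout of a Nipkow disk can be sketched in a few lines of code: holes at equal angular intervals, each stepping slightly closer to the center so that one revolution sweeps every line of the picture. The disk radius and image height below are assumed illustrative values, not Nipkow's actual dimensions.

```python
import math

# Sketch of Nipkow-disk hole geometry for an 18-line scan: holes sit at
# equal angular intervals, each slightly closer to the center than the
# last, so one revolution sweeps 18 horizontal slices of the image.

def nipkow_holes(lines=18, outer_radius=0.20, image_height=0.03):
    """Return (angle_rad, radius_m) for each hole; radii step inward
    by image_height / lines so the slices tile the picture."""
    step = image_height / lines
    return [(2 * math.pi * i / lines, outer_radius - i * step)
            for i in range(lines)]

holes = nipkow_holes()
print(len(holes))                            # 18 holes -> 18 scan lines
print(round(holes[1][0] - holes[0][0], 4))   # equal angular spacing (radians)
```

Spinning such a disk fast enough in front of a light sensor converts the two-dimensional image into a single time-varying electrical signal, which is exactly the rasterization idea the article describes.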

In 1897, the Cathode Ray Tube (CRT) was developed and improvements were made in amplifying the electrical signals. But due to the slow reaction of selenium to light, it was quite difficult to transmit changing images.

In 1907, A.A. Campbell-Swinton of England and the Russian scientist Boris Rosing became the first inventors to use a CRT to receive the electrical signals of an experimental television system. Rosing used mirror-drum scanning to transmit simple geometric shapes to the CRT.

In 1923, an American inventor, Charles Jenkins, used Nipkow's disk idea to build the first practical mechanical television system. By 1931, his Radiovisor Model 100 was being sold as a complete mechanical-television kit.

But the task of achieving the speed and coordination between transmitter and receiver needed to broadcast moving pictures remained difficult.

In 1926, the Scottish inventor John Logie Baird became the first person to succeed in transmitting moving monochromatic images, using the mechanical disk system started by Nipkow. Baird's scanning disk produced an image of 30 lines' resolution, just enough to discern a human face.

Also in 1926, the Hungarian engineer Kálmán Tihanyi designed a television system based on fully electronic scanning and display elements, using the principle of "charge storage" within the scanning tube.

On 25 December 1926, Kenjiro Takayanagi demonstrated a TV system with 40-line resolution that employed a CRT display in Japan. This was the first working example of a fully electronic television receiver.

By 1927, Russian inventor Léon Theremin developed a mirror-drum-based system which used interlacing to achieve an image resolution of 100 lines.

In 1927, Philo Taylor Farnsworth built a working model of electronic television, based on Campbell-Swinton's idea of using a CRT.

In 1927, Baird also invented the world's first video recording system, "Phonovision". Because the signal produced by his 30-line equipment was in the audio frequency range, he was able to capture it on 10-inch gramophone records using conventional audio recording technology.

By 1928, there were about 15 American television stations, broadcasting mostly experimental pictures. By 1933 they had disappeared because of poor picture quality; an adequate scanning system was needed.

In 1930, Vladimir Zworykin and Philo Farnsworth made further advances. Farnsworth came up with a system that could transmit only bright pictures but incorporated elements of a superior scanning system.

Despite decades of effort, there was a limit to how rapidly a mechanical scanner could work, and television pictures around 1930 were fuzzy. It was estimated that some 7 million picture elements per second would need to be transmitted for good quality.

At the Berlin Radio Show in August 1931, Manfred von Ardenne gave the world's first public demonstration of a TV system using a cathode ray tube for both transmission and reception. The world's first electronically scanned TV service began in Berlin in 1935. In August 1936, the Olympic Games in Berlin were carried by cable to TV stations in Berlin and Leipzig where the public could view the games live.

In 1935, the German firm Fernseh and the United States firm Farnsworth Television, owned by Philo Farnsworth, signed an agreement to exchange their television patents and technology to speed the development of TV transmitters and stations in their respective countries.

On 2 November 1936, the BBC (British Broadcasting Corporation) began transmitting the world's first regular public high-definition service from the Victorian Alexandra Palace in north London, which therefore claims to be the birthplace of TV broadcasting as we know it today. In 1941, the Federal Communications Commission settled on the 525-line, 30-frames-per-second, 6 MHz-bandwidth standard that is still in use today. Thirty stations were licensed, but World War II delayed the spread of television, and broadcasts were still black-and-white only.

A color broadcast can be created by transmitting three monochrome images, one in each of the three colors red, green and blue (RGB). Displayed together, in rapid succession or optically overlapped, these images blend into a full-color image as seen by the viewer.
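The three-monochrome-images idea can be made concrete with a tiny sketch: each pixel of the color picture is just the (R, G, B) triple taken from the three single-color frames. The 2x2 frames below are illustrative brightness values, nothing more.

```python
# Sketch of how three monochrome images combine into one color image:
# a color pixel is the (r, g, b) triple drawn from the red, green and
# blue frames at the same position. Values are 0-255 brightness levels.

def combine_rgb(red_frame, green_frame, blue_frame):
    """Zip three monochrome frames into one frame of (r, g, b) pixels."""
    return [[(r, g, b) for r, g, b in zip(rr, gr, br)]
            for rr, gr, br in zip(red_frame, green_frame, blue_frame)]

red   = [[255, 0], [0, 128]]
green = [[0, 255], [0, 128]]
blue  = [[0, 0], [255, 128]]

color = combine_rgb(red, green, blue)
print(color[0][0])  # (255, 0, 0): a pure red pixel
print(color[1][1])  # (128, 128, 128): a gray pixel
```

This also makes the bandwidth problem obvious: sending all three frames naively triples the data, which is exactly the challenge the engineers of the 1950s had to engineer around.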

One of the great technical challenges of broadcasting color images was bandwidth: a naive approach would need potentially three times the bandwidth of the existing black-and-white standards, consuming an excessive amount of radio spectrum.

All-electronic color was introduced in the US in 1953, and the first national color broadcast (the 1954 Tournament of Roses Parade) occurred on January 1, 1954. But it was not until the mid-1960s that color sets started selling in large numbers, due in part to the color transition of 1965, in which it was announced that over half of all network prime-time programming would be broadcast in color that fall. The first all-color prime-time season came just one year later.

Early color sets were either floor-standing console models or tabletop versions nearly as bulky and heavy. The introduction of GE's relatively compact and lightweight Porta-Color set in the spring of 1966 made watching color television a more flexible and convenient proposition. In 1972, sales of color sets finally surpassed sales of black-and-white sets. Also in 1972, the last holdout among daytime network programs converted to color, resulting in the first completely all-color network season.

Color broadcasting in Europe was not standardized on the PAL format until the 1960s, and broadcasts did not start until 1967. By this point many of the technical problems in the early sets had been worked out, and the spread of color sets in Europe was fairly rapid.

By the mid-1970s, the only stations broadcasting in black-and-white were a few high-numbered UHF stations in small markets and a handful of low-power repeater stations in even smaller markets such as vacation spots. By 1979, even the last of these had converted to color, and by the early 1980s B&W sets had been pushed into niche markets, notably low-power uses, small portable sets, or use as video monitor screens in lower-cost consumer equipment in the television production and post-production industry.

Times have moved on from black-and-white to color and beyond. Television sets have become compact, sleek and thin, with added features such as woofers and high-definition LED displays that create a cinema-hall feeling in the house.

Friday, November 28, 2014


When we talk about mass communication, one of the first devices that comes to mind is the radio. Radio was in fact the first device that made mass communication possible: information can be transferred over miles of distance through radio very easily. Here we will see how radio was invented.
In the 1880s, Heinrich Hertz was the first to prove that electric waves could be transmitted and received wirelessly in the form of electromagnetic waves. This is recognized today as the fundamental building block of radio, and the unit of frequency is named after him (the hertz).

In radio's early days, it was used mainly to maintain contact between ships out at sea. Contact was not in the form of speech; Morse code messages were sent back and forth between ships and stations on land. In distress, a sinking ship would radio nearby vessels and land stations to ask for help. Radio also proved very useful during the First World War, when both sides used it to relay messages to troops and top officials as well as to people away from the battle front.

In 1893, Nikola Tesla gave the first public demonstration of radio, transmitting electromagnetic waves, in St. Louis, Missouri.

In 1896, Guglielmo Marconi realized that these waves could be used to send and receive telegraph messages, referring to this as wireless telegraphy. His earliest radio transmissions carried coded signals only about a mile. He then filed for patent protection of his radio apparatus and moved to England for further experiments.

In 1899, the R.F. Matthews became the first ship to request emergency assistance using a wireless apparatus (Marconi's system). The following year, Marconi opened his radio factory in Essex and established a wireless link between France and Britain. In 1901 he sent the first trans-Atlantic signal, from Cornwall, England to Newfoundland, Canada.

Radio grew tremendously over the years. Early transmitters were known as spark-gap machines, established for ship-to-ship and ship-to-shore communication. Communication was then confined to two points and was not the public broadcasting it is today; even so, wireless signals proved effective for rescue communication in sea disasters.

In 1905, the naval battle of Port Arthur was also reported over wireless, and the US weather service tried radiotelegraphy to speed up notice of weather conditions. Radio transmitters steadily improved.

Marconi shared the Nobel Prize in Physics for wireless telegraphy in 1909. But his wireless telegraph transmitted only signals, not voice.

It was Reginald Fessenden who developed a way to combine sound and radio carrier waves. His first effort to transmit this mixed signal to a receiver that removed the carrier wave, letting the listener hear the original sound, failed. However, in 1906, using Alexanderson's alternator, Fessenden made the first long-range transmission of voice.
Lee de Forest is credited as the "father of American radio". De Forest was a direct competitor of Marconi at the turn of the century (1899), when he was chief scientist at the first U.S. radio firm, American Wireless Telephone and Telegraph.
In 1906, de Forest produced the "Audion", a triode vacuum tube that allowed radio signals to be amplified. De Forest was the inventor of space telegraphy: in the early 1900s, a delicate and effective detector of electromagnetic radiation was needed to develop radio further, and de Forest provided one. He was also the first person to use the term 'radio'.

His work led to AM (amplitude-modulated) radio, which allowed many radio stations to broadcast, something the earlier spark-gap transmitters could not support.
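What "combining sound and carrier waves" means can be sketched in a few lines: the audio signal varies the amplitude (the envelope) of a much faster carrier wave. The frequencies and modulation depth below are illustrative choices, not broadcast standards.

```python
import math

# Minimal sketch of amplitude modulation (AM): the audio signal
# modulates the amplitude of a high-frequency carrier. Frequencies
# and modulation depth are assumed values for illustration.

def am_sample(t, audio_hz=440.0, carrier_hz=100_000.0, depth=0.5):
    """One sample of an AM signal: (1 + depth*audio(t)) * carrier(t)."""
    audio = math.sin(2 * math.pi * audio_hz * t)
    carrier = math.cos(2 * math.pi * carrier_hz * t)
    return (1 + depth * audio) * carrier

# The envelope stays between 1 - depth and 1 + depth, so a receiver
# can recover the original audio by tracing the peaks of the wave.
samples = [am_sample(n / 1_000_000) for n in range(1000)]
print(max(samples) <= 1.5 and min(samples) >= -1.5)  # True
```

Because different stations use different carrier frequencies, many AM broadcasts can share the air at once, which is the step beyond the broadband bursts of the old spark-gap transmitters.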
Radio has now become a popular medium of portable entertainment.
Europe's most famous broadcasting station, the British Broadcasting Company (BBC), began in 1922; Marconi was in fact one of its founding members, along with other prominent wireless manufacturers. Broadcasts began locally in London, but by 1925 they had spread to most of the United Kingdom. The station aired plays, classical music and variety programs. In 1927 the BBC became the British Broadcasting Corporation when it was granted a Royal Charter. When the Second World War began, all the television stations shut down and it fell to radio to cover the war.

Edwin Armstrong, a WWI Army officer, Columbia University engineering professor, and the creator of FM radio, invented the regenerative circuit, the first amplifying receiver and reliable continuous-wave transmitter, and the superheterodyne circuit, a means of receiving, converting and amplifying weak, high-frequency electromagnetic signals.
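The superheterodyne trick is easy to show with arithmetic: mix the incoming station frequency with a local oscillator so that every station is converted to one fixed intermediate frequency (IF), which is easy to amplify and filter. The 455 kHz IF used below is the common choice in classic AM receivers; the station frequency is just an example.

```python
# Sketch of the superheterodyne principle: whatever station you tune,
# mixing with a local oscillator offset by a fixed IF converts it to
# the same intermediate frequency for amplification and filtering.

IF_HZ = 455_000  # conventional IF for classic AM receivers

def local_oscillator_hz(station_hz, if_hz=IF_HZ):
    """Tune the local oscillator above the station by the IF."""
    return station_hz + if_hz

def mixed_if_hz(station_hz, lo_hz):
    """Mixing produces the difference frequency |f_LO - f_RF|."""
    return abs(lo_hz - station_hz)

# Tuning to a 1000 kHz AM station:
lo = local_oscillator_hz(1_000_000)
print(mixed_if_hz(1_000_000, lo))  # 455000: the same IF for every station
```

Building the sensitive amplification stages once, at a single fixed frequency, is what made Armstrong's receivers both cheaper and far more selective than earlier designs.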
In the 21st century, technological advances have given birth to internet radio, and satellite radio is another recent development in the field: one can listen to various international radio stations without any hassle. Besides all these latest additions, ham radio may yet be the next big thing, gearing up to reach a wider audience soon.