Wednesday, 30 December 2020

YouTube's History and Its Impact on the Internet


YouTube is one of the largest and most popular video distribution platforms on the Internet. More than 4 billion hours of video are watched on the platform every month, and an estimated 500 hours of new content are uploaded to YouTube every passing minute.

Since its origins in 2005, YouTube has transformed itself from a showcase for amateur videos into a platform that also distributes original content.

It has also enabled the creation of an entirely new profession — YouTube content creator, which can be a very profitable career for some YouTubers around the world. 

What was the original purpose of YouTube?
YouTube was originally created as a platform for anyone to post any video content they desired. It was hoped that users could use the site to upload, share, and view content without restriction. 

It has since grown to become one of the foremost video distribution sites in the world. Today, many content creators make a decent living by selling ad space before or on videos they create and upload onto the site. 
Thanks to things like YouTube's Partner Program and Google's AdSense, a few people can actually create successful careers as YouTubers. 

YouTube was founded on Valentine's Day in 2005. It was the brainchild of Chad Hurley, Steve Chen, and Jawed Karim, who were all former employees of PayPal. 

The platform, like so many others in Silicon Valley, began as an angel-funded enterprise with makeshift offices in a garage.

The first YouTube video, "Me at the Zoo." Source: jawed/YouTube
According to its founders, the idea was born at a dinner party in San Francisco about a year earlier, in 2004. The trio was frustrated by how hard it was, at the time, to find and share video clips online. 

“Video, we felt, really wasn’t being addressed on the Internet,” said Chad Hurley in an early interview. “People were collecting video clips on their cell phones … but there was no easy way to share [them].”
In April 2005, the very first video was posted to the site, and by May the beta version of YouTube was up on the net. Titled "Me at the Zoo," it was a 19-second clip posted by Karim himself, featuring footage of Karim at the San Diego Zoo talking about elephants and their trunks. 

By September of 2005, YouTube had its first video with one million views. This was a Nike ad that had gone viral. 

This first YouTube viral video was a clip of Brazilian soccer player Ronaldinho receiving a pair of Golden Boots. Nike was also one of the first major companies to embrace YouTube's promotional potential.  
The following month, in November of 2005, the venture capital firm Sequoia Capital invested an impressive $3.5 million in the business, and Roelof Botha (who also formerly worked for PayPal) joined YouTube's board of directors. Sequoia and Artis Capital Management invested an additional $8 million in 2006, as the website saw significant growth in its first few months.

Who founded YouTube?
As previously mentioned, YouTube was founded by:

Chad Hurley
Steve Chen
Jawed Karim
All three had worked at PayPal before founding YouTube in 2005. 

Chad Hurley studied design at the Indiana University of Pennsylvania and joined PayPal after graduating in 1999. At PayPal, he primarily focused on the user experience (UX) of the company's interface. 

Steven Shih Chen was born in 1978 in Taipei, Taiwan. His family emigrated to the U.S. when he was eight years old. Steve left the Illinois Mathematics and Science Academy prior to graduating.

He later attended the University of Illinois at Urbana-Champaign, where he graduated in 2002 with a degree in computer science. He would later join PayPal. 
Jawed Karim was born in 1979 in Merseburg, East Germany. His father was Bangladeshi, and his mother was German. 

After experiencing xenophobia in Germany, his father moved the family to Saint Paul, Minnesota, in 1992. Jawed would later study computer science at the University of Illinois Urbana-Champaign, but left prior to graduating.

After dropping out, Jawed became an early employee of the fledgling PayPal. While at PayPal, he continued his coursework and eventually graduated with a degree in computer science; he went on to earn a master's degree in computer science from Stanford University. 

How was YouTube created?
Steve Chen, Chad Hurley, and Jawed Karim met at PayPal.

The concept of YouTube was inspired, according to Jawed Karim, by videos of Janet Jackson's wardrobe malfunction at the Super Bowl and of the devastating tsunami in the Indian Ocean. As a venture-funded startup, YouTube received $11.5 million in investment led by Sequoia Capital across 2005 and 2006. In February 2005, the domain name was registered, and the company set up makeshift headquarters above a pizzeria in California. In April, the first-ever video, "Me at the Zoo," was uploaded by Karim. After a beta testing period, the site launched officially in December 2005, and a Nike commercial became the first video to receive one million views, according to Engadget. 

By February of 2005, YouTube's now-famous logo (since changed as of 2017) was registered as a trademark, and the website domain name was also purchased.

The original idea for YouTube was for users to upload videos introducing themselves and saying what they were interested in. This didn't really take off, and the co-founders soon pivoted to a more general video-sharing site. 
How did YouTube get its name?
Unlike a lot of other company names, YouTube's name is actually quite self-explanatory.

"The name “YouTube” is actually pretty straightforward. The “You” represents that the content is user-generated, created by individual users and not the site itself, and “Tube” is a nod toward an older original term for television. Soon after YouTube's URL was registered, it came under immediate attack by another company called Universal Tube & Rollform Equipment. Their website address just happened to be very similar —www.utube.com. They filed a lawsuit against YouTube, which appears to have been unsuccessful. Today that company's URL is www.utubeonline.com.

 Since its early days in 2005, YouTube has grown to become a behemoth of the Internet. It is now present in more than 75 countries and available in 61 languages, with hundreds of hours of video content uploaded every minute!

Today, the site has more than one billion users and has become the de facto video sharing platform on the Internet.  

Tuesday, 29 December 2020

The history of cakes

The History of Cakes



For those of us who share an affinity for sweets, cake probably ‘takes the cake’ as our favorite dessert ever. It’s the one treat most commonly associated with momentous celebrations, and it can even manage to evoke nostalgia. Not to mention, a flavor profile exists for practically every taste, even those who don’t like chocolate (although we have to respectfully agree to disagree here). But, what you may not know is that cake has a history that is as rich and detailed as those exquisite cakes we see on TV and in our own homes. Let’s enhance our cake trivia and indulge in some history of cakes. 

The First Cakes

The word cake is of Viking origin, derived from the Old Norse word "kaka." The first cakes ever made were actually quite different from the ones we eat today. Interestingly, the ancient Egyptians were the first culture to exhibit advanced baking skills, and in ancient times cakes were more bread-like in appearance and sweetened with honey. The Greeks also had an early form of cheesecake, while the Romans developed versions of fruitcakes with raisins, nuts, and other fruits.

Meanwhile, in mid-17th-century Europe, cakes began to be baked more frequently as a result of advances in technology and access to ingredients. Europe is credited with the invention of modern cakes, which were round and topped with icing. Incidentally, the first icing was usually a boiled mixture of sugar, egg whites, and flavorings. During this time, many cakes still contained dried fruits, like currants and citrons.

Then, in the 19th century, cake as we know it today became more popular. However, the treat was considered a luxury, as sweet ingredients like sugar and chocolate were very expensive. During this time, cakes were baked with extra-refined white flour and baking powder instead of yeast. Buttercream frostings also began replacing traditionally boiled icings. And thanks to advancements in temperature-controlled ovens, a baker's life became much easier: bakers no longer had to continually watch and wait for the cake to finish baking. Even more, the Industrial Revolution made ingredients more readily available and cheaper, so more people could bake cakes at home or even buy them at the store.

The Birthday Cake
Now, we can’t talk about the history of cakes without mentioning birthday cakes! Today, cake is obviously used to celebrate occasions, like weddings, engagements, anniversaries, holidays, and of course birthdays. But, when did we actually start celebrating birthdays with cake, and why? Notably, in Ancient Greece, it was tradition to celebrate the births of their gods. And, for the celebration of goddess Artemis’ birth, people would bake a round cake in her honor, to symbolize the moon. Theories suggest that the cake was decorated with lit candles so it would glow like the moon.

Then, by the 13th century, German children began celebrating their birthdays (called Kinderfest) with cakes that were also lit with candles. Candles stood for the light of life, with one candle for each year and one additional candle for continued life. However, unlike today, the candles burned all day and were often replaced when the flame died down. Finally, before the cake was eaten, the candles were blown out, and the child would make a wish. The belief was that the smoke would carry the wish to heaven. And, like modern tradition, the birthday girl or boy wouldn’t tell anyone their wish so it would come true.

Why are Cakes Round?
Although cakes can be baked in virtually any shape imaginable, there are several theories as to why most cakes are traditionally round. Historically, cakes were shaped by hand into round balls, and while baking, the dough naturally relaxed into a rounded shape. Today, we often use hoops and pans to create the distinctive circular shape of a cake.

There is another theory, too: that the gods prefer round cakes. In ancient times, some civilizations baked cakes as a kind gesture for their gods and spirits. A round cake was meant to symbolize the cyclical nature of life, as well as the sun and the moon. Incidentally, this theory could explain why we serve cakes on special occasions like birthdays, to symbolize the cycle of life.

We don’t know about you, but all of that history of cakes really has us craving a piece. Luckily, we don’t know a better baker than our pastry chef, Natalie. Take a look at Natalie’s quick-motion baking videos on Facebook to see her in action. And, be sure to check out the Bakery Menu to taste Natalie’s creations.


 

Monday, 28 December 2020

About Kalinga War





When Ashoka, the son of the Mauryan emperor Bindusara and the grandson of Chandragupta Maurya, ascended the throne of Magadha in 273 B.C., he set out, treading in the footsteps of his forefathers, to expand his empire. In the 12th year of his reign, he sent a message to Kalinga demanding its submission, but the Kalingaraj refused to submit to the Mauryan empire.

As a result, Ashoka led a huge army against Kalinga in 261 B.C. The freedom-loving people of Kalinga offered stiff resistance to the Mauryan army, and the whole of Kalinga turned into a battle arena. History offers but few examples of wars as fiercely fought as this one. The Kalingaraj himself commanded his army in the battlefield. However, the limited forces of Kalinga were no match for the overwhelming Magadha army. Contrary to Ashoka's expectations, the people of Kalinga fought with such great valor that on a number of occasions they came very close to victory. The soldiers of Kalinga perished on the battlefield, fighting till their last breath for their independence. The victory ultimately rested with Ashoka.

The war took a tremendous toll on life and property. The 13th rock edict of Ashoka throws light on this war: at least 100,000 Kalingans were killed, while another 150,000 were taken prisoner, and an almost equal number of Magadha soldiers were also killed. There was not a single man left in Kalinga willing to live a life of slavery.

This is a singular instance in history of a war that brought about a complete change of heart in a stern ruler like Ashoka. The scene of the war presented a horrible sight: the whole terrain was covered with the corpses of soldiers, the wounded groaned in severe pain, vultures hovered over the dead bodies, orphaned children mourned the loss of their near and dear ones, and widows looked on in blank despair.

This sight overwhelmed Ashoka. He realized that a victory at such a cost was not worthwhile. The war led to Ashoka's turn towards Buddhism, and after two and a half years he became an ardent follower of Buddhism under Acharya Upagupta.

Apart from the metropolitan area, which was directly governed, the empire was divided into four provinces, each under a prince or member of the royal family whose official status was that of a viceroy. Governors administering smaller units were selected from amongst the local people. The provincial ministers were powerful, could act as a check on the viceroy, and were on various occasions the effective rulers. Ashoka sent inspectors on tour every five years for an additional audit and check on provincial administration. There were specially appointed judicial officers both in the cities and in the rural areas. Fines served as punishment in most cases, but certain crimes were considered too serious to be punished by fines alone, and capital punishment was delivered.

Each province was sub-divided into districts, each of these into groups of villages, and the final unit of administration was the village. The group of villages was staffed with an accountant, who maintained boundaries, registered land and deeds, kept a census of the population and a record of the livestock; and the tax collector, who was concerned with the various types of revenue. Each village had its own officials, such as the headman, who was responsible to the accountant and the tax-collector. Officers at this level in rural administration were paid either by a remission of tax or by land grants.

Urban administration had its own hierarchy of officers. The city superintendent maintained law and order and the general cleanliness of the city. Cities were generally built of wood, necessitating the maintaining of fire precautions. The city superintendent was assisted by an accountant and a tax collector. Megasthenes has described the administration of Pataliputra in detail. The city was administered by thirty officials, divided into six committees of five. Each committee supervised one of the following functions: questions relating to industrial arts, the welfare of foreigners, the registering of births and deaths, matters relating to trade & commerce, supervision of the public sale of manufactured goods, and, finally, collection of the tax on articles sold.

Two of the key offices controlled by the central administration were those of the Treasurer and the Chief Collector. The Treasurer was responsible for keeping an account of the income in cash and for storing the income in kind. The Chief Collector, assisted by a body of clerks, kept records of the taxes which came in from various parts of the empire. The accounts of every administrative department were properly kept and were presented jointly by all the ministers to the king, perhaps to avoid fraud and embezzlement. Each department had a large staff of superintendents and subordinate officers. The superintendents worked at local centers and were a link between local administration and the central government. Those specifically listed in the Arthashastra are the superintendents of gold and goldsmiths, and of the storehouse, commerce, forest produce, the armoury, weights and measures, tools, weaving, agriculture, liquor, slaughterhouses, prostitutes, ships, cows, horses, elephants, chariots, infantry, passports, and the city.

Salaries of officials and expenditure on public works constituted a sizeable portion of the national expenses, with one quarter of the total revenue reserved for these. The higher officials were extremely well paid, and this must have been a drain on the treasury. The chief minister, the purohita, and the army commander received 48,000 panas; the treasurer and the chief collector, 24,000 panas; the ministers, 12,000 panas; the accountants and clerks, 500 panas; and artisans, 120 panas. The value of the pana is not indicated, nor are the intervals at which the salaries were paid. 

Coming with part 2........................

Sunday, 27 December 2020

Japan, China, the United States and the Road to Pearl Harbor, 1937–41

Between 1937 and 1941, escalating conflict between China and Japan influenced U.S. relations with both nations and ultimately contributed to pushing the United States toward full-scale war with Japan and Germany. 
Photograph of the Marco Polo Bridge Incident

At the outset, U.S. officials viewed developments in China with ambivalence. On the one hand, they opposed Japanese incursions into northeast China and the rise of Japanese militarism in the area, in part because of their sense of a longstanding friendship with China. On the other hand, most U.S. officials believed that the United States had no vital interests in China worth going to war over with Japan. Moreover, the domestic conflict between Chinese Nationalists and Communists left U.S. policymakers uncertain of success in aiding such an internally divided nation. As a result, few U.S. officials recommended taking a strong stance prior to 1937, and so the United States did little to help China for fear of provoking Japan. The likelihood of U.S. aid to China increased after July 7, 1937, when Chinese and Japanese forces clashed on the Marco Polo Bridge near Beijing, throwing the two nations into a full-scale war. As the United States watched Japanese forces sweep down the coast and then into the capital of Nanjing, popular opinion swung firmly in favor of the Chinese. Tensions with Japan rose when the Japanese Army bombed the U.S.S. Panay as it evacuated American citizens from Nanjing, killing three. The U.S. Government, however, continued to avoid conflict and accepted an apology and indemnity from the Japanese. An uneasy truce held between the two nations into 1940.

FDR signing Lend-Lease

In 1940 and 1941, President Franklin D. Roosevelt formalized U.S. aid to China. The U.S. Government extended credits to the Chinese Government for the purchase of war supplies, as it slowly began to tighten restrictions on Japan. The United States was the main supplier of the oil, steel, iron, and other commodities needed by the Japanese military as it became bogged down by Chinese resistance but, in January, 1940, Japan abrogated the existing treaty of commerce with the United States. Although this did not lead to an immediate embargo, it meant that the Roosevelt Administration could now restrict the flow of military supplies into Japan and use this as leverage to force Japan to halt its aggression in China.

After January 1940, the United States combined a strategy of increasing aid to China through larger credits and the Lend-Lease program with a gradual move towards an embargo on the trade of all militarily useful items with Japan. The Japanese Government made several decisions during these two years that exacerbated the situation. Unable or unwilling to control the military, Japan’s political leaders sought greater security by establishing the “Greater East Asia Co-Prosperity Sphere” in August, 1940. In so doing they announced Japan’s intention to drive the Western imperialist nations from Asia. However, this Japanese-led project aimed to enhance Japan’s economic and material wealth so that it would not be dependent upon supplies from the West, and not to “liberate” the long-subject peoples of Asia. In fact, Japan would have to launch a campaign of military conquest and rule, and did not intend to pull out of China. At the same time, several pacts with Western nations only made Japan appear more of a threat to the United States. First, Japan signed the Tripartite Pact with Germany and Italy on September 27, 1940 and thereby linked the conflicts in Europe and Asia. This made China a potential ally in the global fight against fascism. Then in mid-1941, Japan signed a Neutrality Pact with the Soviet Union, making it clear that Japan’s military would be moving into Southeast Asia, where the United States had greater interests. A third agreement with Vichy France enabled Japanese forces to move into Indochina and begin their Southern Advance. The United States responded to this growing threat by temporarily halting negotiations with Japanese diplomats, instituting a full embargo on exports to Japan, freezing Japanese assets in U.S. banks, and sending supplies into China along the Burma Road. Although negotiations restarted after the United States increasingly enforced an embargo against Japan, they made little headway. Diplomats in Washington came close to agreements on a couple of occasions, but pro-Chinese sentiments in the United States made it difficult to reach any resolution that would not involve a Japanese withdrawal from China, and such a condition was unacceptable to Japan’s military leaders.

Faced with serious shortages as a result of the embargo, unable to retreat, and convinced that the U.S. officials opposed further negotiations, Japan’s leaders came to the conclusion that they had to act swiftly. For their part, U.S. leaders had not given up on a negotiated settlement, and also doubted that Japan had the military strength to attack the U.S. territory. Therefore they were stunned when the unthinkable happened and Japanese planes bombed the U.S. fleet at Pearl Harbor on December 7, 1941. The following day, the United States declared war on Japan, and it soon entered into a military alliance with China. When Germany stood by its ally and declared war on the United States, the Roosevelt Administration faced war in both Europe and Asia.

Saturday, 26 December 2020

When was smartphone invented?




The average smartphone owner touches their phone 2,617 times a day. Things would be a lot different if it weren’t for the first-ever smartphone, Simon.

The smartphone turned 25 in 2017. Here's how it grew into what we know it as today.

When was the first smartphone invented?
The first smartphone was invented in 1992.

Created by IBM, the Simon Personal Communicator was truly revolutionary. It was the first phone to combine the functions of a cell phone (i.e., you could make calls) and a PDA, which back then was a handheld device you could use for emails and to send faxes.

You could even receive pages, remember those? 
The first smartphone, Simon, made by IBM
Even back then, smartphones were expensive. With a contract, the price of the Simon was around $899, which is about $1,435 in today's money. And you thought the iPhone X was expensive (though the Simon's battery reportedly lasted only about an hour).

Around 50,000 Simons were sold in the early '90s, which may not seem like a lot, but it was the first-ever smartphone. The legacy of the early Simon lives on in every iPhone or Android smartphone you buy today.
The first Blackberry – released in 1999
Blackberry was next on the smartphone bandwagon, releasing its device in 1999. This was the company’s first portable email device, so technically not a smartphone.

That came a few years later in 2002, with the release of the Blackberry 5810 which could make calls too.

Blackberry was the company that pioneered the full QWERTY keyboard, soon to be replaced with touch screens. RIP.
The first iPhone – released in 2007
The launch of the first iPhone is widely considered the moment that smartphones hit the mainstream. Released in June 2007 and announced by Steve Jobs, it revolutionised the mobile market.

The first iPhone was only £470 for an 8GB model. It didn't even have 3G capability. Around 6 million of the devices were sold before the arrival of the iPhone 3G, which saw the original discontinued. 
The first App Store – released in 2008
The iPhone truly became a smartphone when you could customise the software on it, which you could in July 2008 with the release of the first App Store.

Pioneered by Apple with the iOS App Store, the concept was later adopted by Google's Android platform as Android smartphones began to be released.

It was so popular, there were over 10m downloads of apps in the first weekend of its release. At the time, Jobs called it a “grand slam”.

In the first quarter of 2016 alone, 17.2bn apps were downloaded on iOS and Android devices.

The first Android phone – released in 2008
Android is the dominant operating system in the smartphone market, reaching over 2bn monthly active users in 2017. But it wasn't always this way: its first smartphone was released a whole year after Apple's iPhone.

The first Android device was the HTC Dream, or the T-Mobile G1 if you were in the US. It had a sliding keyboard and a touch screen (though multi-touch wasn't supported at launch), along with 3G, Wi-Fi, and a three-megapixel camera.

What does the smartphone market look like now?
According to analysis company Statista, the number of smartphone users in the world was expected to reach 2.71bn by 2019.

This is still only 36 percent of the world’s population, demonstrating that smartphones still have a long way to go. 

Friday, 25 December 2020

Rare 'great conjunction' of Jupiter and Saturn wows skywatchers around the world


For the last several days, skywatchers have been captivated by the Great Conjunction between Jupiter and Saturn. The closest pass took place on Monday, December 21, but the spectacle itself began days earlier — and will last until at least Christmas Day. 

What is a great conjunction?
A conjunction is a close pairing of any two planets in the sky. Jupiter and Saturn are the two largest planets visible to the naked eye, hence the expression 'great conjunction' for their meetings. These two align roughly every 20 years, which is relatively rare compared to conjunctions involving planets closer to the Sun (which consequently have shorter orbits). 
Jupiter orbits the Sun once in 12 years, and Saturn once in 30. High school arithmetic tells us that in 60 more years (the LCM of 12 and 30), i.e. in 2080, the two planets will align at roughly the same place where stargazers watched them on December 21, 2020. In these 60 years, Jupiter will have orbited the Sun five times, while Saturn will have done so twice.

But they will have met twice more during this period, though at different places in the sky. In 12 more years, Jupiter will return to its current place; in the following 8 years, it will complete two-thirds of another 12-year cycle around the Sun. In the same 20 years, Saturn will have completed two-thirds of its 30-year cycle. In other words, the two planets will meet again in 2040, and yet again in 2060. 
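To make the arithmetic above concrete, here is a minimal Python sketch (not from the article) that reproduces these numbers from the rounded orbital periods of 12 and 30 years; the true periods are closer to 11.86 and 29.4 years, which is why real conjunction dates drift slightly.

from math import lcm

JUPITER_PERIOD = 12  # years, rounded (the true value is about 11.86)
SATURN_PERIOD = 30   # years, rounded (the true value is about 29.4)

# How often Jupiter "laps" Saturn: 1 / (1/12 - 1/30) = 20 years.
synodic_period = 1 / (1 / JUPITER_PERIOD - 1 / SATURN_PERIOD)

# How often they meet at roughly the same spot in the sky: lcm(12, 30) = 60 years.
same_spot_interval = lcm(JUPITER_PERIOD, SATURN_PERIOD)

print(f"A conjunction roughly every {synodic_period:.0f} years")      # ~20
print(f"Same place in the sky every {same_spot_interval} years")      # 60
print("Meetings after 2020:",
      [2020 + round(synodic_period) * k for k in (1, 2, 3)])           # 2040, 2060, 2080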

So, why is this conjunction special?

It's the alignment. We measure the position of a planet in terms of the angle it makes, relative to a fixed reference direction, on the plane of Earth's orbit. When we say two planets have aligned in a conjunction, it means they make the same angle with that reference direction.

In practice, though, the two planets almost never appear to overlap exactly. Planets in a conjunction typically sit a little above or below each other, because their orbits are slightly tilted with respect to each other.


This time, Jupiter and Saturn are just a tenth of a degree apart as viewed from Earth. From some vantage points, that might give them the appearance of merging into one, but viewers around the world have found them distinct enough to tell apart.

Also, the position of Earth matters: not every alignment provides a clear view.

And how rare is this conjunction?


The last time Jupiter and Saturn came this close was in 1623. For context, Galileo had discovered four of Jupiter's moons with his telescope about a dozen years earlier — but scientists today believe Galileo would not have found it easy to see that conjunction, because the planets were aligned too close to the Sun from Earth's perspective. From an Indian context, Jahangir was ruling the Mughal empire at the time, and the Maratha warrior king Chhatrapati Shivaji was yet to be born.


The last time the two planets came this close and could be easily viewed in the night sky was in 1226. This was just a year before the death of the Mongol ruler Genghis Khan.

Tuesday, 22 December 2020

How was the Zoom app invented?

 ZOOM APP

Zoom was founded by Eric Yuan, a former corporate vice president for Cisco Webex. He left Cisco in April 2011 with 40 engineers to start a new company, originally named Saasbee, Inc. The company had trouble finding investors because many people thought the video telephony market was already saturated. In June 2011, the company raised $3 million of seed money from WebEx founder Subrah Iyar, former Cisco SVP and General Counsel Dan Scheinman, and venture capitalists Matt Ocko, TSVC, and Bill Tai.

In May 2012, the company changed its name to Zoom, influenced by Thacher Hurd's children's book Zoom City. In September 2012, Zoom launched a beta version that could host conferences with up to 15 video participants. In November 2012, the company signed Stanford University as its first customer. The service was launched in January 2013, after the company raised a $6 million Series A round from Qualcomm Ventures, Yahoo! founder Jerry Yang, WebEx founder Subrah Iyar, and former Cisco SVP and General Counsel Dan Scheinman. Zoom launched version 1.0 of the program, which allowed up to 25 participants per conference. By the end of its first month, Zoom had 400,000 users, and by May 2013 it had 1 million users. In July 2013, Zoom established partnerships with B2B collaboration software providers, such as Redbooth (then Teambox), and also created a program named Works with Zoom, which established partnerships with Logitech, Vaddio, and InFocus. In September 2013, the company raised $6.5 million in a Series B round from Horizon Ventures and existing investors. At that time, it had 3 million users. (By April 2020, CEO Eric Yuan announced that Zoom's daily users had ballooned to more than 200 million.)

On February 4, 2015, the company received US$30 million in Series C funding from investors including Emergence Capital, Horizons Ventures (Li Ka-shing), Qualcomm Ventures, Jerry Yang, and Patrick Soon-Shiong. At that time, Zoom had 40 million users, with 65,000 organizations subscribed and a total of 1 billion meeting minutes since it was established. Over the course of 2015 and 2016, the company integrated its software with Slack, Salesforce, and Skype for Business. With version 2.5 in October 2015, Zoom increased the maximum number of participants allowed per conference to 50, and later to 1,000 for business customers. In November 2015, former RingCentral president David Berman was named president of the company, and Peter Gassner, the founder and CEO of Veeva Systems, joined Zoom's board of directors.

In January 2017, the company raised US$100 million in Series D funding from Sequoia Capital at a US$1 billion valuation, making it a so-called unicorn. In April 2017, Zoom launched a scalable telehealth product allowing doctors to host remote consultations with patients. In May, Zoom announced integration with Polycom's conferencing systems, enabling features such as multiple screens and device meetings, HD and wireless screen sharing, and calendar integration with Microsoft Outlook, Google Calendar, and iCal. From September 25 to 27, 2017, Zoom hosted Zoomtopia 2017, its first annual user conference. At this conference, Zoom announced a partnership with Meta to integrate Zoom with augmented reality, integration with Slack and Workplace by Facebook, and first steps toward an artificial intelligence speech recognition program.


♥️♥️♥️

TO BE CONTINUED.......

Monday, 21 December 2020

About Wifi

WIFI 


In 1971, ALOHAnet connected the Hawaiian Islands with a UHF wireless packet network. ALOHAnet and the ALOHA protocol were early forerunners of Ethernet and, later, the IEEE 802.11 protocols, respectively.

A 1985 ruling by the U.S. Federal Communications Commission released the ISM band for unlicensed use. These frequency bands are the same ones used by equipment such as microwave ovens and are subject to interference.

The technical birthplace of Wi-Fi is the Netherlands. In 1991, NCR Corporation, together with AT&T Corporation, invented the precursor to 802.11, intended for use in cashier systems, under the name WaveLAN. NCR's Vic Hayes, who held the chair of IEEE 802.11 for 10 years, along with Bell Labs engineer Bruce Tuch, approached the IEEE to create a standard, and both were involved in designing the initial 802.11b and 802.11a standards within the IEEE. They have both subsequently been inducted into the Wi-Fi NOW Hall of Fame. The first version of the 802.11 protocol was released in 1997 and provided up to 2 Mbit/s link speeds. This was updated in 1999 with 802.11b to permit 11 Mbit/s link speeds, which proved popular.

In 1999, the Wi-Fi Alliance formed as a trade association to hold the Wi-Fi trademark under which most products are sold.

The major commercial breakthrough came with Apple Inc. adopting Wi-Fi for its iBook series of laptops in 1999. It was the first mass consumer product to offer Wi-Fi network connectivity, which Apple branded as AirPort. This was done in collaboration with the same group that helped create the standard: Vic Hayes, Bruce Tuch, Cees Links, Rich McGinn, and others from Lucent. 

Wi-Fi uses a large number of patents held by many different organizations. In April 2009, 14 technology companies agreed to pay CSIRO $1 billion for infringements of CSIRO patents. This led to Australia labeling Wi-Fi as an Australian invention, though this has been the subject of some controversy. CSIRO won a further $220 million settlement for Wi-Fi patent infringements in 2012, with global firms in the United States required to pay CSIRO licensing rights estimated at an additional $1 billion in royalties. In 2016, the wireless local area network Test Bed was chosen as Australia's contribution to the exhibition A History of the World in 100 Objects, held in the National Museum of Australia.




Etymology and terminology

The name Wi-Fi, commercially used at least as early as August 1999, was coined by the brand-consulting firm Interbrand. The Wi-Fi Alliance had hired Interbrand to create a name that was "a little catchier than 'IEEE 802.11b Direct Sequence'." Phil Belanger, a founding member of the Wi-Fi Alliance, has stated that the term Wi-Fi was chosen from a list of ten potential names invented by Interbrand.

The name Wi-Fi has no further meaning, and was never officially a shortened form of "Wireless Fidelity". Nevertheless, the Wi-Fi Alliance used the advertising slogan "The Standard for Wireless Fidelity" for a short time after the brand name was created, and the Wi-Fi Alliance was also called the "Wireless Fidelity Alliance Inc" in some publications.

Interbrand also created the Wi-Fi logo. The yin-yang Wi-Fi logo indicates the certification of a product for interoperability.

Certification

The IEEE does not test equipment for compliance with its standards. The non-profit Wi-Fi Alliance was formed in 1999 to fill this void—to establish and enforce standards for interoperability and backward compatibility, and to promote wireless local-area-network technology. As of 2017, the Wi-Fi Alliance included more than 800 companies, including 3Com (now owned by HPE/Hewlett Packard Enterprise), Aironet (now owned by Cisco), Harris Semiconductor (now owned by Intersil), Lucent (now owned by Nokia), Nokia, and Symbol Technologies (now owned by Zebra Technologies). The Wi-Fi Alliance restricts the use of the Wi-Fi brand to technologies based on the IEEE 802.11 standards. This includes wireless local area network (WLAN) connections, device-to-device connectivity (such as Wi-Fi Peer-to-Peer, aka Wi-Fi Direct), personal area network (PAN), local area network (LAN), and even some limited wide area network (WAN) connections. Manufacturers with membership in the Wi-Fi Alliance, whose products pass the certification process, gain the right to mark those products with the Wi-Fi logo.


CITYWIDE

In the early 2000s, many cities around the world announced plans to construct citywide Wi-Fi networks. There are many successful examples; in 2004, Mysore (Mysuru) became India's first Wi-Fi-enabled city. A company called WiFiyNet has set up hotspots in Mysore, covering the complete city and a few nearby villages.

In 2005, St. Cloud, Florida, and Sunnyvale, California, became the first cities in the United States to offer citywide free Wi-Fi (from MetroFi). Minneapolis has generated $1.2 million in profit annually for its provider.

In May 2010, the then London mayor Boris Johnson pledged to have London-wide Wi-Fi by 2012. Several boroughs including Westminster and Islington  already had extensive outdoor Wi-Fi coverage at that point.

Officials in South Korea's capital Seoul are moving to provide free Internet access at more than 10,000 locations around the city, including outdoor public spaces, major streets, and densely populated residential areas. Seoul will grant leases to KT, LG Telecom, and SK Telecom. The companies will invest $44 million in the project, which was to be completed in 2015.


I HOPE YOU FIND THIS BLOG USEFUL. PLEASE SHARE AND COMMENT DOWN BELOW.

THANK YOU

AHONA SARKAR...



Sunday, 20 December 2020

Who invented WhatsApp and why?

 WHATSAPP

WhatsApp was the most used messaging app in 2020, as it helps us connect with our near and dear ones.

Jan Koum, co-founder and CEO of Facebook's WhatsApp messaging service, says the idea for the company he co-founded with Brian Acton in 2009 came about so he could stop missing calls on his new smartphone.

“It started with me buying an iPhone,” Koum told an audience of several hundred Silicon Valley veterans gathered for an event this week at the Computer History Museum in Mountain View, California. “I got annoyed that I was missing calls when I went to the gym.”

He and Acton then built an app that could let their friends know whether or not they were available, thanks to an easy-to-use feature called “Status.”

“We didn’t set out to build a company. We just wanted to build a product that people used,” Koum said late Wednesday, during an onstage panel discussion that preceded an advance screening of a new documentary called “Silicon Valley: The Untold Story.”

The app didn’t take off right away, even though it was accepted into Apple’s App Store, Koum recounted.

“We were so excited when it launched,” he said. “And so disappointed when no one used it.”

That soon changed, however.

By 2014, WhatsApp, thanks to its easy-to-use interface and uncluttered design, had more than 400 million users globally.

Corporate suitors like Facebook soon came calling.

‘All a blur’

When panel moderator Michael Malone asked Koum what he remembered most about the day he agreed to sell the company in early 2014, Koum drew a blank.

“It was all a blur. I don’t remember any of that except being in a room with lawyers for three days straight,” he said.

Ultimately, Facebook agreed to pay more than $19 billion to acquire WhatsApp, turning both of its founders into billionaires.

Last year, WhatsApp reached 1.3 billion monthly users.

Malone asked him why he still goes to work since it’s like Koum “won the lottery.”

“We still have a lot of people who don’t use our product. We want to convince them,” Koum replied. “We still have problems to solve.”

When asked by CNBC after the panel what it was like since Acton left the company last year, Koum answered, 

“We miss Brian.”


Thursday, 17 December 2020

Are there really ghosts?

Are there really ghosts?




If you believe in ghosts, you're not alone. Cultures all around the world believe in spirits that survive death to live in another realm. In fact, ghosts are among the most widely believed paranormal phenomena: millions of people are interested in ghosts, and a 2013 Harris Poll found that 43% of Americans believe in ghosts.

The idea that the dead remain with us in spirit is an ancient one, appearing in countless stories, from the Bible to "Macbeth." It even spawned a folklore genre: ghost stories. Belief in ghosts is part of a larger web of related paranormal beliefs, including near-death experiences, life after death, and spirit communication. The belief offers many people comfort — who doesn't want to believe that our beloved but deceased family members are looking out for us, or are with us in our times of need? 

People have tried to (or claimed to) communicate with spirits for ages; in Victorian England, for example, it was fashionable for upper-crust ladies to hold séances in their parlors after tea and crumpets with friends. Ghost clubs dedicated to searching for ghostly evidence formed at prestigious universities, including Cambridge and Oxford, and in 1882 the most prominent organization, the Society for Psychical Research, was established. A woman named Eleanor Sidgwick was an investigator (and later president) of that group and could be considered the original female ghostbuster. In America during the late 1800s, many psychic mediums claimed to speak to the dead — but were later exposed as frauds by skeptical investigators such as Harry Houdini. 

It wasn't until recently that ghost hunting became a widespread interest around the world. Much of this is due to the hit Syfy cable TV series "Ghost Hunters," now in its second decade of not finding good evidence for ghosts. The show spawned dozens of spinoffs and imitators, and it's not hard to see why the show is so popular: the premise is that anyone can look for ghosts. The two original stars were ordinary guys (plumbers, in fact) who decided to look for evidence of spirits. Their message: You don't need to be an egghead scientist, or even have any training in science or investigation. All you need is some free time, a dark place, and maybe a few gadgets from an electronics store. If you look long enough any unexplained light or noise might be evidence of ghosts.

The science and logic of ghosts

One difficulty in scientifically evaluating ghosts is that a surprisingly wide variety of phenomena are attributed to ghosts, from a door closing on its own, to missing keys, to a cold area in a hallway, to a vision of a dead relative. When sociologists Dennis and Michele Waskul interviewed ghost experiencers for their 2016 book "Ghostly Encounters: The Hauntings of Everyday Life" (Temple University Press), they found that "many participants were not sure that they had encountered a ghost and remained uncertain that such phenomena were even possible, simply because they did not see something that approximated the conventional image of a 'ghost.' Instead, many of our respondents were simply convinced that they had experienced something uncanny — something inexplicable, extraordinary, mysterious, or eerie." Thus, many people who go on record as claiming to have had a ghostly experience didn't necessarily see anything that most people would recognize as a classic "ghost," and in fact, they may have had completely different experiences whose only common factor is that it could not be readily explained. 


DID YOU KNOW??

ACCORDING TO SOME BELIEFS, GHOSTS FORM WHEN A PERSON REALIZES THAT HE IS GOING TO LEAVE THE EARTH WHILE MANY OF HIS DREAMS ARE STILL UNFULFILLED. THIS IS WHY PEOPLE PERFORM Śrāddha.






to be continued.........................




How was the refrigerator invented?

HOW WAS THE REFRIGERATOR INVENTED?




A refrigerator (colloquially, a fridge) is a home appliance consisting of a thermally insulated compartment and a heat pump (mechanical, electronic, or chemical) that transfers heat from its inside to its external environment, so that its inside is cooled to a temperature below the room temperature. Refrigeration is an essential food storage technique in developed countries. The lower temperature lowers the reproduction rate of bacteria, so the refrigerator reduces the rate of spoilage. A refrigerator maintains a temperature a few degrees above the freezing point of water; the optimum temperature range for perishable food storage is 3 to 5 °C (37 to 41 °F). A similar device that maintains a temperature below the freezing point of water is called a freezer. The refrigerator replaced the icebox, which had been a common household appliance for almost a century and a half.
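As a rough, back-of-the-envelope illustration of the heat-pump principle described above (this sketch, and the 22 °C room temperature it assumes, are not from the article), thermodynamics sets an upper limit on how efficiently heat can be moved out of the cold compartment:

# Ideal (Carnot) coefficient of performance for a refrigerator:
# how many joules of heat can, at best, be pumped out of the cold
# compartment per joule of work supplied by the compressor.

def carnot_cop(t_inside_c: float, t_room_c: float) -> float:
    """Upper bound on refrigeration COP for the given temperatures in Celsius."""
    t_inside_k = t_inside_c + 273.15
    t_room_k = t_room_c + 273.15
    return t_inside_k / (t_room_k - t_inside_k)

# Using the article's 3-5 °C storage range and an assumed 22 °C kitchen:
print(round(carnot_cop(4.0, 22.0), 1))  # about 15.4 in the ideal limit

# Real household refrigerators achieve only a small fraction of this ideal
# figure, because of compressor losses, heat leaking through the insulation,
# and defrost cycles.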


The first cooling system for food involved ice. Artificial refrigeration began in the mid-1750s and developed in the early 1800s. In 1834, the first working vapor-compression refrigeration system was built. The first commercial ice-making machine was invented in 1854. In 1913, refrigerators for home use were invented. In 1923 Frigidaire introduced the first self-contained unit. The introduction of Freon in the 1920s expanded the refrigerator market during the 1930s. Home freezers as separate compartments (larger than necessary just for ice cubes) were introduced in 1940. Frozen foods, previously a luxury item, became commonplace.



Freezer units are used in households as well as in industry and commerce. Commercial refrigerator and freezer units were in use for almost 40 years before the common home models. The freezer-over-refrigerator style had been the basic style since the 1940s until modern, side-by-side refrigerators broke the trend. A vapor compression cycle is used in most household refrigerators, refrigerator-freezers, and freezers. Newer refrigerators may include automatic defrosting, chilled water, and ice from a dispenser in the door.



Domestic refrigerators and freezers for food storage are made in a range of sizes. Among the smallest are Peltier-type refrigerators designed to chill beverages. A large domestic refrigerator stands as tall as a person and may be about 1 m wide, with a capacity of 600 L. Refrigerators and freezers may be free-standing or built into a kitchen. The refrigerator allows the modern household to keep food fresh for longer than before. Freezers allow people to buy food in bulk and eat it at leisure, and bulk purchases save money.


DID YOU KNOW?

In 1834, American inventor Jacob Perkins, living in London at the time, built the world's first working vapor-compression refrigeration system, using ether in a closed cycle. His prototype system worked and was the first step toward modern refrigerators, but it didn't succeed commercially.



Wednesday, 16 December 2020

GLOBAL WARMING

GLOBAL WARMING



GLOBAL WARMING IS THE LONG-TERM HEATING OF EARTH'S CLIMATE SYSTEM, OBSERVED SINCE 1850.
 Global warming is a term used for the observed century-scale rise in the average temperature of the Earth's climate system and its related effects. Scientists are more than 95% certain that nearly all of the global warming is caused by increasing concentrations of greenhouse gases (GHGs) and other human-caused emissions.

IT IS CAUSED BY HUMAN ACTIVITIES THAT PRODUCE A LOT OF GREENHOUSE GAS AND CREATE POLLUTION, SUCH AS:

1) GREENHOUSE GAS EMISSIONS (THE MAIN DRIVER OF GLOBAL WARMING)

2) BURNING OF FOSSIL FUELS

3) BURNING FIRECRACKERS

4) DRIVING CARS, SCOOTERS, AND BIKES FOR SHORT-DISTANCE JOURNEYS

5) VARIATIONS IN THE SUN'S INTENSITY

6) INDUSTRIAL AND AGRICULTURAL ACTIVITY

7) DEFORESTATION

WHAT IS GLOBAL WARMING AND WHAT ARE ITS EFFECTS?

Global warming, the gradual heating of Earth's surface, oceans, and atmosphere, is caused by human activity, primarily the burning of fossil fuels that pump carbon dioxide (CO2), methane, and other greenhouse gases into the atmosphere.


HOW SERIOUS IS GLOBAL WARMING?

Higher temperatures are worsening many types of disasters, including storms, heatwaves, floods, and droughts. A warmer climate creates an atmosphere that can collect, retain, and drop more water, changing weather patterns in such a way that wet areas become wetter and dry areas drier. Global warming causes climate change, which poses a serious threat to life on earth in the forms of widespread flooding and extreme weather.


HOW DID IT START?

Scientists generally regard the later part of the 19th century as the point at which human activity started influencing the climate. But the new study brings that date forward to the 1830s.

NOW, IT IS ALSO SAID THAT GLOBAL WARMING IS GOOD...

HERE'S THE INFORMATION! 

Studying global warming is important since it helps determine future climate expectations. Through the use of latitude, one can determine the likelihood of snow and hail reaching the surface, and also identify the thermal energy from the Sun that is accessible to a region. Global warming also drives sea-level rise, caused by the expansion of warmer seas and melting ice sheets and glaciers. 

AFTER MUCH RESEARCH, IT IS SAID THAT GLOBAL WARMING HAS BOTH GOOD AND BAD EFFECTS.

IT IS 99% BAD AND 1% GOOD.

SO IT'S BETTER TO CHANGE SOME OF OUR ACTIVITIES, WHICH WILL SURELY HELP OUR NATURE.


1. Speak up!

2. Power your home with renewable energy.

3. Weatherize, weatherize, weatherize.

4. Invest in energy-efficient appliances.

5. Reduce water waste.

6. Actually eat the food you buy—and make less of it meat.

7. Buy better bulbs.

8. Pull the plug(s).

9. Drive a fuel-efficient vehicle.

10. Maintain your ride.

11. Rethink planes, trains, and automobiles.

12. Shrink your carbon profile.


 
WHAT WILL HAPPEN IF GLOBAL WARMING IS NOT REDUCED?

Heatwaves will become more frequent and severe around the world, affecting hundreds of millions—or even billions—of people if we don't act.

WHY SHOULD WE STOP GLOBAL WARMING?

A warmer climate increases public health challenges like heat-aggravated illnesses, increases in vector-borne diseases, and decreased access to safe water and food. Cutting short-lived climate pollutants can slow the rate of warming and lower public health risks.

I HOPE YOU HAVE FOUND THIS BLOG HELPFUL.

COMING WITH MORE SUCH BLOGS, ON THE WAY!

THANK YOU











