Engineering prowess famously catapulted the 14-year-old search giant
into its place as one of the world’s most successful, influential, and
frighteningly powerful companies. Its constantly refined search
algorithm changed the way we all access and even think about
information. Its equally complex ad-auction platform is a perpetual
money-minting machine. But other, less well-known engineering and
strategic breakthroughs are arguably just as crucial to Google’s
success: its ability to build, organize, and operate a huge network of
servers and fiber-optic cables with an efficiency and speed that rocks
physics on its heels.
Google has spread its infrastructure
across a global archipelago of massive buildings—a dozen or so
information palaces in locales as diverse as Council Bluffs, Iowa; St.
Ghislain, Belgium; and soon Hong Kong and Singapore—where an unspecified
but huge number of machines process and deliver the continuing
chronicle of human experience.
This is what makes Google Google:
its physical network, its thousands of fiber miles, and those many
thousands of servers that, in aggregate, add up to the mother of all
clouds. This multibillion-dollar infrastructure allows the company to
index 20 billion web pages a day. To handle more than 3 billion daily
search queries. To conduct millions of ad auctions in real time. To
offer free email storage to 425 million Gmail users. To zip millions of
YouTube videos to users every day. To deliver search results before the
user has finished typing the query. In the near future, when Google
releases the wearable computing platform called Glass, this
infrastructure will power its visual search results.
The
problem for would-be bards attempting to sing of these data centers has
been that, because Google sees its network as the ultimate competitive
advantage, only critical employees have been permitted even a peek
inside, a prohibition that has most certainly included bards. Until now.
Here I am, in a huge white building in Lenoir, standing near a
reinforced door with a party of Googlers, ready to become that rarest of
species: an outsider who has been inside one of the company’s data
centers and seen the legendary server floor, referred to simply as “the
floor.” My visit is the latest evidence that Google is relaxing its
black-box policy. My hosts include Joe Kava, who’s in charge of building
and maintaining Google’s data centers, and his colleague Vitaly
Gudanets, who populates the facilities with computers and makes sure
they run smoothly.
A sign outside the floor dictates that no one
can enter without hearing protection, either salmon-colored earplugs
that dispensers spit out like trail mix or panda-bear earmuffs like the
ones worn by airline ground crews. (The noise is a high-pitched thrum
from fans that control airflow.) We grab the plugs. Kava holds his hand
up to a security scanner and opens the heavy door. Then we slip into a
thunderdome of data …
Urs Hölzle had never stepped into a data
center before he was hired by Sergey Brin and Larry Page. A hirsute,
soft-spoken Swiss, Hölzle was on leave as a computer science professor
at UC Santa Barbara in February 1999 when his new employers took him to
the Exodus server facility in Santa Clara. Exodus was a colocation site,
or colo, where multiple companies rent floor space. Google’s “cage” sat
next to servers from eBay and other blue-chip Internet companies. But
the search company’s array was the most densely packed and chaotic. Brin
and Page were looking to upgrade the system, which often took a full
3.5 seconds to deliver search results and tended to crash on Mondays.
They brought Hölzle on to help drive the
effort.
It wouldn’t be easy. Exodus was “a huge mess,” Hölzle
later recalled. And the cramped hodgepodge would soon be strained even
more. Google was not only processing millions of queries every week but
also stepping up the frequency with which it indexed the web, gathering
every bit of online information and putting it into a searchable format.
AdWords—the service that invited advertisers to bid for placement
alongside search results relevant to their wares—involved
computation-heavy processes that were just as demanding as search. Page
had also become obsessed with speed, with delivering search results so
quickly that it gave the illusion of mind reading, a trick that required
even more servers and connections. And the faster Google delivered
results, the more popular it became, creating an even greater burden.
Meanwhile, the company was adding other applications, including a mail
service that would require instant access to many petabytes of storage.
Worse yet, the tech downturn that left many data centers
underpopulated in the late ’90s was ending, and Google’s future leasing
deals would become much more costly.