How Amazon Uses Explosive-Resistant Devices To Transfer Data To AWS
Channel: CNBC
Cloud computing is taking over. Demand continues to rise from both companies and consumers that rely on remote storage and computing power accessible from anywhere. Tech giants Google, Microsoft, IBM and others are vying to be the go-to providers. But one company has remained the leader: Amazon.

AWS has a commanding lead in the cloud right now. In fact, if you add up number two, three, four and five, they add up to what AWS does.
Amazon Web Services is behind a lot of the technology we use, from calling a Lyft to checking your video doorbell, to streaming your favorite shows.

When people are watching a Prime movie, or they're watching a Netflix movie, or a Hulu movie, or others like that, they're watching it and streaming off of Amazon Web Services. The Super Bowl streams off us, and also Major League Baseball, and now NASCAR and Formula One racing as well. If you use Intuit to do your taxes, that runs on AWS.
AWS has been one of Amazon's most profitable business endeavors. Last year AWS generated more than $25 billion in sales.

Plus, they're still growing like a weed. Get this, they're up 47%.

In the first quarter of this year, revenue climbed to $7.7 billion, up from $5.44 billion a year earlier.

We have over 2.2 million customers using AWS today. They're usually big companies like Goldman Sachs or Capital One. There's over 4,000 government agencies that run on us today.
Companies left and right are abandoning their own data centers for Amazon's or other cloud providers. But moving all of that data online can be a challenge.

The transfer fees for moving data over the network can be quite high. And also, it can take a while if you have petabytes and petabytes and petabytes, or yottabytes, of data.
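To give a rough sense of the time problem being described, here is a minimal back-of-the-envelope estimator (our own sketch, not an AWS tool) for how long a given amount of data takes to push over a network link; the exabyte figures quoted later in the video can be checked the same way.

```python
# A rough, best-case estimate of network transfer time.  Assumes decimal units
# (1 PB = 10**15 bytes) and a fully utilized link with no protocol overhead,
# so real transfers will generally take longer.

def transfer_time_seconds(size_bytes: float, link_gbps: float) -> float:
    """Seconds needed to move size_bytes over a link of link_gbps gigabits per second."""
    return (size_bytes * 8) / (link_gbps * 1e9)

if __name__ == "__main__":
    petabyte = 1e15
    secs = transfer_time_seconds(1 * petabyte, link_gbps=1)
    print(f"1 PB over a 1 Gbps line: ~{secs / 86400:.0f} days")  # roughly 93 days, best case
```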
So Amazon built physical products to make transferring large amounts of data easier: a portable data transfer device capable of operating in a war zone, called Snowball, and even a giant truck called Snowmobile to help companies migrate their data to the cloud.
What about if I have exabytes of data?

We have a lot of customers who have exabytes of data. And the first thing that came to mind was, we're going to need a bigger box.
So why would a company need to move to a cloud service provider like AWS?

Most of our customers save between 22 and 54 percent versus running all in, building their own data center, building their own networks, powering it, having people to operate it.

One of the biggest reasons that people look to the cloud is not necessarily cost, but flexibility.
Developers can get access to massive amounts of compute, storage and networking resources.

AWS says it has the largest global infrastructure footprint of any cloud provider, meaning it has data centers placed in regions around the globe where there is concentrated demand. It has the capacity to allow companies to tap into more server space depending on their needs.
When they have a big retail day they can use a million servers, when their normal load is, say, 40 or 50 or 60 servers. And the ability to do that is astronomically expensive to do on-prem. That's why you see the startups growing so fast on AWS, because they get access to a Fortune 500 infrastructure for pennies on the dollar.
Netflix, for example, has always used Amazon as its cloud provider. But for a company that wants to migrate its data to the cloud, typically a massive data transfer needs to take place.
Some companies have hundreds of terabytes, petabytes and even exabytes of data. For some perspective on how big that is, your average MP3 song is about three megabytes. A gigabyte is about 1,000 megabytes, or around 300 songs. A terabyte is about 1,000 gigabytes, or 300,000 songs. A petabyte is 1,000 terabytes, or 300 million songs, and an exabyte is around 1,000 petabytes, or 300 billion songs. A single MP3 file might take a few seconds to transfer over the internet. 300 million or 300 billion, however, might take a while.
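Those conversions are easy to check with a few lines of arithmetic; the snippet below simply redoes the narration's song math, assuming a 3 MB song and decimal units.

```python
# Sanity-checking the narration's song math, assuming an average MP3 of about
# 3 MB and decimal units (1 GB = 1,000 MB, 1 TB = 1,000 GB, and so on).

MP3_MB = 3  # megabytes per song, as stated in the video

units_in_mb = {
    "gigabyte": 1_000,
    "terabyte": 1_000_000,
    "petabyte": 1_000_000_000,
    "exabyte": 1_000_000_000_000,
}

for name, megabytes in units_in_mb.items():
    print(f"1 {name} ≈ {megabytes // MP3_MB:,} songs")

# Prints roughly 333 / 333 thousand / 333 million / 333 billion songs,
# which matches the rounded "300 / 300,000 / 300 million / 300 billion"
# figures in the narration.
```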
It's often called in IT the python-eating-a-pig problem. So if you imagine a python, you can visualize it eating a pig. You get this big lump that you have to move through the python. So you have a little network and you've got a big lump to move, and for some of our customers it would've taken them years and years to upload their data over their network.
Amazon has tried to solve this problem of cost and time by creating really tough hardware, called Snowballs, which people who operate data centers can connect their infrastructure to, make copies of the data, and then send those Snowballs to AWS data centers so that the data can be moved more quickly.
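The video doesn't go into how a customer actually orders a device or loads it, but as a rough illustration of the workflow just described, here is a minimal sketch using the AWS SDK for Python (boto3). The bucket, address ID, role ARN, device address and credentials are placeholders, and capacity options and endpoints vary by region and device model, so treat this as an assumption-laden outline rather than a recipe.

```python
# Sketch of the Snowball import workflow: (1) order an import job, then
# (2) once the device arrives and has been unlocked with its manifest and
# unlock code, copy data to it over its S3-compatible endpoint.
# All ARNs, IDs, addresses and credentials below are placeholders.
import boto3

# --- Step 1: request a Snowball import job (runs against the AWS API) ---
snowball = boto3.client("snowball", region_name="us-east-1")
job = snowball.create_job(
    JobType="IMPORT",  # import data from your site into S3
    Resources={"S3Resources": [{"BucketArn": "arn:aws:s3:::my-migration-bucket"}]},
    AddressId="ADID-0000-example",          # shipping address created beforehand
    RoleARN="arn:aws:iam::123456789012:role/SnowballImportRole",
    SnowballCapacityPreference="T80",       # capacity tier; offerings vary by region
    ShippingOption="SECOND_DAY",
    Description="Data center migration, batch 1",
)
print("Snowball job created:", job["JobId"])

# --- Step 2: after the device is on-site and unlocked, copy data locally ---
s3_on_device = boto3.client(
    "s3",
    endpoint_url="http://192.0.2.10:8080",  # placeholder address of the device
    aws_access_key_id="LOCAL_ACCESS_KEY",   # local credentials from the device tooling
    aws_secret_access_key="LOCAL_SECRET_KEY",
)
s3_on_device.upload_file("backup.tar", "my-migration-bucket", "migration/backup.tar")
```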
The smallest storage Snowball we have is about 50 terabytes. That's 5,000 DVDs, and the largest Snowball we have is between 11,000 and 14,000 DVDs, depending on how you compress it.
We work with our Lab 126 folks on the industrial design, and our Kindle folks on the e-ink label. So imagine if you're shipping hundreds of these, you could easily put on the wrong label or put them in the wrong box. That doesn't happen, it's all automated. It knows where it's going and it labels itself.
And my boss, Charlie Bell, worked on the Space Shuttle, and we actually used some things off the Space Shuttle, where they have to handle the shock of launch and landing.

Vass said designing the Snowball to withstand the rigors of transit was not an easy task, since it had to be highly durable as well as less than 50 pounds.
We actually went to our shipping partners, and we also went to the fulfillment center and talked to them. From that we learned that a really hard problem to solve is that it had to be under 50 pounds. We also wanted people to be able to check it as regular luggage. And that's actually a hard design constraint, to make something that durable, with that dense compute and storage, in under 50 pounds.
The Snowball even passed an explosives test and meets the military's requirements for being airdropped.

To meet the specifications, we have to drop the Snowball from 28 feet, 80 times, on all four corners and all six sides. And then, because we build it so robustly, we are able to also pass the DoD 901 Barge Explosive Test, where you have 83 pounds of plastic explosive going off 20 feet from the device multiple times. Which is a tremendous percussion wave that would turn your insides to jello. If you were standing there it would kill you.
And temperature-wise, it's designed for the most extreme environments. They can operate at really high temperatures, like 140 degrees ambient temperature, and really cold temperatures, like -20. And it can have unconditioned power from a generator and it'll continue to operate.
For customers calmly transferring data from the safety of their office, this could all seem like overkill. But in certain instances, it's proven critical.
When the volcano was going off in Hawaii, the USGS used the Snowballs. They had local servers and the lava was coming up on their building. And they didn't want to lose all that extremely valuable data they'd collected. They also knew the Snowballs could operate in high-temperature environments. So they shipped the Snowballs there, downloaded the data and shipped the Snowballs out. And so they were able to capture all that data without losing it.
Oil rigs are another area where we see a lot of them. In the military they're very, very popular. They're used in forward-deployed units. They're used on Navy ships. They're used in aircraft. They're used in Special Ops locations all over the world. For cybersecurity, where they're collecting network data and reacting to it locally.
There's even a six-micron dust filter option you can snap on the front. So if you're operating them in a desert, they can filter the sand out and not have the sand go into the device.
Even Hollywood has started taking advantage of Amazon's Snow family.

For studio shoots, they're shooting in 8K and 12K cameras now. And so that's a lot of data. So they put it on the Snowballs, and you can see the screens on the front, which are used for quality control. When they're done with the shoot, they actually ship the Snowball back, it uploads the data into the cloud, and then they post-process it.
There's another, I would call it the up-sell version, which is the Snowball Edge, and that's a 100-terabyte solution. Now, the interesting thing about Snowball Edge is you can actually put compute on there and actually run workloads.
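As a loose illustration of that "compute on the device" idea: Snowball Edge exposes local, EC2-compatible endpoints, so launching a workload can look roughly like the sketch below. The endpoint address, port, image ID, instance type and credentials are placeholders assumed for illustration, and the exact values depend on the device configuration.

```python
# Hedged sketch: launching a local compute instance on a Snowball Edge via its
# EC2-compatible endpoint.  The endpoint URL, AMI ID, instance type and
# credentials are placeholders; real values come from the device after it has
# been unlocked on your network.
import boto3

ec2_on_device = boto3.client(
    "ec2",
    endpoint_url="http://192.0.2.10:8008",  # placeholder local endpoint
    aws_access_key_id="LOCAL_ACCESS_KEY",
    aws_secret_access_key="LOCAL_SECRET_KEY",
    region_name="snow",                      # placeholder region label
)

# Start one instance from an image that was preloaded onto the device.
resp = ec2_on_device.run_instances(
    ImageId="s.ami-0123456789abcdef0",       # placeholder image ID
    InstanceType="sbe1.medium",              # placeholder instance type
    MinCount=1,
    MaxCount=1,
)
print("Launched:", resp["Instances"][0]["InstanceId"])
```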
AWS was the first public cloud provider to make hardware like this for data transfer, but competitors have since developed similar products. Microsoft is the number two player in the public cloud market, behind AWS. It has Data Box products that have room for 1 petabyte of data, making them larger than what Google and IBM offer today.
Google Cloud, which is behind AWS and Microsoft Azure, has Transfer Appliance products, which are storage servers that you can install inside a rack in your data center. But it's not as popular as the one that AWS is offering today.
But AWS is the only company that felt like it needed to go even bigger.

Snowmobile has the equivalent of 1,250 Snowballs in it. And so it's what we call a 100-petabyte truck. To put in context how much data Snowmobile can take, let's say the typical notebook is 500 gigabytes. 100 petabytes would be 200,000 notebooks that get ingested into this Mack truck.
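Both of those equivalences are straightforward to verify; the snippet below just redoes the arithmetic, with the 80 TB figure for a standard Snowball taken as an assumption.

```python
# Checking the Snowmobile capacity claims with simple decimal-unit arithmetic
# (1 PB = 1,000 TB = 1,000,000 GB).  The 80 TB Snowball size is the commonly
# cited usable capacity of the standard device; treat it as an assumption here.

SNOWMOBILE_PB = 100
SNOWBALL_TB = 80
NOTEBOOK_GB = 500

snowballs = SNOWMOBILE_PB * 1_000 / SNOWBALL_TB
notebooks = SNOWMOBILE_PB * 1_000_000 / NOTEBOOK_GB

print(f"Snowballs per Snowmobile: {snowballs:,.0f}")          # 1,250
print(f"500 GB notebooks per Snowmobile: {notebooks:,.0f}")   # 200,000
```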
DigitalGlobe had this challenge where they had a huge amount of satellite imagery. They're one of the largest producers of satellite imagery in the world. And so it would have taken them about 10 years to upload it over the network. And they were actually the first customer that called us and said, hey, can't you just send a truck? And so we built one.
We'll drive the truck up to your data center. We have power and network fiber that will connect to your data center. Fill 'er up, and then the truck will come back, put the trailer back on the truck, and we'll move it back to AWS.
If you think about the idea of moving an exabyte of data, if you basically assign a 10 gigabit per second line to it, which is pretty reasonable, it would take you about 26 years. Using 10 Snowmobiles, it would take you a little less than six months.
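That multi-decade figure is easy to reproduce with the same back-of-the-envelope estimate used earlier (decimal units, a fully utilized link, no protocol overhead); the Snowmobile side of the comparison is bounded by logistics rather than bandwidth.

```python
# Reproducing the exabyte-over-the-network figure from the interview.
# Assumes 1 EB = 10**18 bytes and a fully saturated 10 Gbps link.
EXABYTE_BYTES = 1e18
LINK_GBPS = 10

seconds = (EXABYTE_BYTES * 8) / (LINK_GBPS * 1e9)
print(f"1 EB over a 10 Gbps line: ~{seconds / (3600 * 24 * 365):.0f} years")
# Prints ~25 years, the same ballpark as the "about 26 years" quoted above.

# Ten 100 PB Snowmobiles hold an exabyte in a single round trip, so the quoted
# "a little less than six months" is driven by shipping, loading and unloading
# time rather than by network bandwidth.
```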
I remember a few years ago, AWS announced the Snowmobile by driving it onto a stage at an AWS event, and people just went nuts. They were like, what? How could a cloud be a truck? And it was cool. It was innovative. But it's not like we've heard about the Snowmobile being a huge business hit.
It's not like the Snowmobile is what's making up most of AWS' revenue. Far from it.

AWS would not disclose how many customers have used a Snowmobile to make a transfer, or how many trucks it has in service. We weren't able to see the inside of the truck because the technology is closely guarded, but it's essentially a data center on wheels.
Snowmobile uses what Amazon calls Zero-G racks, which suspend the system from both the top and the bottom of the truck to handle the impacts while in motion. And it has its own power and cooling.

Once the transfer is complete, the truck enters transport mode. An armed guard and an escort accompany the truck as it returns to Amazon's data centers for the upload.
Its location is monitored over cellular and satellite communications throughout the entire journey. Amazon intentionally left the truck devoid of branding to keep it discreet.

When it connects to the AWS ingestion center, the data is decrypted, hashed and validated. And then once the data is all unloaded and you've validated it's unloaded, the Snowball is set up for another run for another customer.
Just as Amazon has disrupted retail with its e-commerce business, the Snow products are an example of how the company has become a force to be reckoned with in the cloud computing industry as well.

They're continually first with features. They're first with different ways of doing things, like networking. They have the most compute variation. They have the widest range of machine learning offerings too.
Two years ago, we deployed 1,440 new products and services. We deployed 1,954, that was in 2018. I'm sure we will beat that in 2019. So innovation has been a key point. Security has been a key point. Ours is the only cloud that's certified to run the intelligence agencies' and DoD-type clouds, at the top secret, compartmented level of certification. And we continue to have just a wide variety, with the most databases available to our customers and the most variety of databases, the most robust storage platforms, and the most options in compute. In addition to that, we've just been doing it longer than anybody else. So we have these years and years of operational excellence and experience behind it.