lol. So, we're doing some image processing with TPUs. We want to save the results directly to our cloud bucket, rather than having the results be transmitted to our VM, saved locally, then uploaded to our cloud bucket. Got a funny idea...

I guess this will be a ramble:
TPUs support a limited set of operations. But what you get in exchange is blazing speed.

A TPU consists of 8 cores, plus a CPU. (Yes, the TPU has a CPU -- weird concept, but think of it like a big computer with 8 GPUs. Obviously, a computer with GPUs has a CPU.)
In the same way that GPUs are much more restrictive than CPUs – it's a lot easier to write programs for CPUs than GPUs! – the TPU cores are much more restrictive than the TPU's CPU.

But flip that around and it's a positive statement: the TPU's CPU gives you some nice flexibility.
These exact differences seem to be mostly undocumented at the moment, though I hope to change that. But, one extremely nice feature is that you can generate tensorboard logs directly from the TPU's CPU – no need for the data to pass through your VM. It gets saved right to gcloud.
Now, one thing you *can't* do with TPUs is save files locally. But think about it like "the TPU is a gigantic computer running in a factory somewhere." (It is!) Obviously, if you're SSH'd into that computer, you can't access your local HD. Not without special magic like sshfs.
So that poses a problem on two fronts. First, say you have a bunch of images. You want the TPU to train on those images. Ok, so you put the images on your server connected to the TPU, and you're good to go, right? Bzzt. Those images are *on your local HD*; the TPU can't see them.
How do the TPUs get data? You convert it into TFRecord format, then upload it to a cloud bucket. TPUs have dedicated support for parsing TFRecord files. In the same way they can parse JPG files, they can also parse .tfrecord files.
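For the curious, here's roughly what that conversion step looks like, assuming TensorFlow 2.x. The filename and feature names (`images.tfrecord`, `image_raw`, `label`) are made up for illustration; the fake 8x8 image stands in for real data.

```python
import tensorflow as tf

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))

def _int64_feature(value):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))

# A fake 8x8 black image stands in for a real JPG.
jpg_bytes = tf.io.encode_jpeg(tf.zeros([8, 8, 3], dtype=tf.uint8)).numpy()

# Pack each image (plus any metadata) into a tf.train.Example and write it out.
with tf.io.TFRecordWriter("images.tfrecord") as writer:
    example = tf.train.Example(features=tf.train.Features(feature={
        "image_raw": _bytes_feature(jpg_bytes),
        "label": _int64_feature(0),
    }))
    writer.write(example.SerializeToString())
```

Then you `gsutil cp images.tfrecord gs://your-bucket/` and the TPU's input pipeline reads the gs:// path directly.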
And that's the big speedup. It's one of the main reasons TPUs are so ungodly *fast* when you use them properly.

When you're training on a GPU, the data is being streamed from your hard drive to the GPUs. Slow hard drive = slow training.

Cloud buckets are fast, and local to TPUs.
But what about the other direction? Say you want to process a bunch of images somehow. Suppose you're @citnaj and you want to implement DeOldify on TPUs. Obviously, you want users to be able to see the results – that's the whole point! So how do you get the data off the TPU?
The most obvious way to do it is to pull down the data as it's generated. This can work, and it's what I've normally done till now. The link between your GCE VM <-> TPU is around 500 MB/s (that's mega*bytes*, not bits) which normally is fast enough that you don't need to care.
But say you're @citnaj on a high dose of LSD. You start tripping hard, and your brain starts going places. Places like "what if we want to get, like, a *lot* of results from the TPU, really really quickly? Far more than we could feasibly stream to our local VM?"
We're now departing from the realm of practical-and-useful to theoretical-for-unknown-reasons. Realistically you probably won't need to write data directly from the TPU back to your cloud bucket. But if you did, is there any way to do it? TPUs don't seem to implement file I/O.
If you've made it this far, then congratulations. You're now prepared to hear the punchline that originally cracked me up.

The answer is, yes, TPUs can save the data. Remember how I said that TPUs can save training logs for tensorboard? Those logs are literally just data.
So if you need to save a shitload of images from TPUs, and for some reason you don't want to stream them down to your server, you can still do it! Just save the images as a tf.summary.image(…) directly from the TPU's CPU.
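The trick, sketched in TF 2.x. The local path below is a stand-in so it's runnable anywhere; on a real TPU job you'd point the writer at a bucket path like gs://your-bucket/logs so nothing ever touches the VM's disk. The random batch stands in for whatever the TPU actually produced.

```python
import glob
import tensorflow as tf

# Stand-in log directory; swap for gs://your-bucket/logs on a real TPU setup.
logdir = "/tmp/tpu_image_logs"
writer = tf.summary.create_file_writer(logdir)

# Fake batch of 4 RGB "results" in [0, 1], standing in for processed images.
images = tf.random.uniform([4, 64, 64, 3])

# Each call dumps the image tensors into the event file -- they're just data.
with writer.as_default():
    tf.summary.image("results", images, step=0, max_outputs=4)
writer.flush()

print(glob.glob(logdir + "/events.out.tfevents*"))
```

The images end up encoded inside the TensorBoard event file in the bucket, where you can view them in TensorBoard or parse them back out.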

Laughed for like 5min.
Thread by Shawn Presser