Impressive…
Let’s see Paul Allen’s black hole
Look at that subtle persimmon coloring… the tasteful thickness of it…
Oh my god. It even has an event horizon.
My God, it's even full of stars
Are you sweating, Neil deGrasse Tyson?
No I have a condition that prevents me from sweating
You haven’t been to a Pizza Hut in Wokingham?
Pizza Express, please
Great now I want spaghetti.
You probably eat horrible pasta if anything on this thread made you want pasta. Lol
Good god I love the path this thread is tracing
I have to return some videotapes.
I tattooed this on someone yesterday hah.
I need to see this.
Be kind. Rewind.
thank you for this
This is perfect.
Once you come close enough, you have no choice but to enter it 😩
Something wrong? 😳 Patrick? 😖 You're sweating! 😥💦💦
Literally my favorite comment
this comment delivers
Do you like Huey Lewis and the News?
In '87, Huey released this; Fore!, their most accomplished album. I think their undisputed masterpiece is "Hip To Be Square". A song so catchy, most people probably don't listen to the lyrics. But they should, because it's not just about the pleasures of conformity and the importance of trends. It's also a personal statement about the band itself.
HEY PAUL!!
**TRY TO GET A RESERVATION AT DORSIA NOW YOU FUCKING STUPID BASTARD!**
This sounds so sus
I fucking hate you.
Sussy 🚌
this guy fucks
Why, tired of the expression?
Hmmmmmm, I love me a "black hole"
Most impressive.
Obi-Wan has taught you well
It's amazing how impressive it sounds when the truth is twisted. I could post a screenshot of a modern videogame and say "this image took the equivalent of 100 million CPU hours to compute" with a tiny footnote like "*if calculated on a Commodore 64"
Even longer if you get 2nd graders to hand calculate it
Note that the *image processing* consumed all that and not the black hole itself. 🙂
Happy cake day! Your parents must be really proud
Unicornslaps dad here. Yes. I’m proud.
Thanks Dad!
r/titlegore
Some people think "compute" can be a noun. Those people are wrong and should remember the word "computation" which is still a viable word and doesn't need to be thrown into the trash can.
Can you explain the wording of "Compute Pipeline" then? Not being condescending; I'm a developer who works with graphics APIs for GPU computation, not a linguist. It's the only wording I've ever seen in documentation, and "compute" there is always used as a noun, never just a verb.
There's no good reason for it not to be "computation pipeline" or "computing pipeline" instead. That being said, there's no "pipe" in it, so "pipeline" itself doesn't make sense. A series of different computations being done on transformed data is more like a factory assembly line than a pipeline, but "pipeline" is shorter, so it's not too far off.
RE: Pipeline

I wouldn't say the data is transformed from one step to the next; it flows without major modifications. Take one frame from a particle emitter with a trail, for example:

- Renderer prepares a texture from what is shown on the screen
- Vertex Shader moves the particle around the space
- Texture Shader (or Fragment Shader, depending on the API) fades the alpha value of the previous texture in the texture buffer to create a fading trail effect
- Texture Shader 2 applies the vertices calculated in the Vertex Shader to the texture as opaque points to be the new head of the trail

We don't typically use "[Factory](https://www.tutorialspoint.com/design_pattern/images/factory_pattern_uml_diagram.jpg)" unless it's a Class that builds another Class with some inheritance. Pipelines are a point A to point B system.

EDIT: plaque -> opaque
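To make that point-A-to-B flow concrete, here's a minimal Python sketch; the stage functions are hypothetical stand-ins for the shader steps above, not any real graphics API:

```python
# Minimal sketch of a pipeline as a point A to point B chain of stages.
# Stage names are hypothetical stand-ins for the shader steps above.

def vertex_stage(frame):
    # Move the particle around the space.
    x, y = frame["particle_pos"]
    frame["particle_pos"] = (x + 1.0, y + 0.5)
    return frame

def fade_stage(frame):
    # Fade the previous texture's alpha to create the trail effect.
    frame["trail_alpha"] *= 0.9
    return frame

def head_stage(frame):
    # Stamp the new particle position as the opaque head of the trail.
    frame["trail_head"] = frame["particle_pos"]
    return frame

def run_pipeline(frame, stages):
    # Data flows through each stage in order, without major restructuring.
    for stage in stages:
        frame = stage(frame)
    return frame

frame = {"particle_pos": (0.0, 0.0), "trail_alpha": 1.0, "trail_head": None}
print(run_pipeline(frame, [vertex_stage, fade_stage, head_stage]))
```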
I guess it’s just an ease of use thing then? Like how people don’t tend to use big words like “elephantine” or “commodious” without very good reason when words like “big” and “huge” exist? Especially in stuff like tech where communication between a team is massively important, you want to keep as many words as simple as possible, hence we use terms like “compute pipeline” rather than the much more painful to use “computational conveyor belt”.
Some people don't seem to work in tech, I guess
apparently the hole also consumed the title of this post, at least partially
This clickbait title should have at least included the words "Simulated Image of ____". It's not a real black hole
It is a real black hole.
You're joking, right? This is talking about the Event Horizon Team image of Sagittarius A*, the supermassive black hole at the centre of the Milky Way, with the image made by interferometry of radio telescopes.
I believe the point is that creating the image consumed that much computational power. The title makes it sound like all of that computational power actually was consumed by the black hole itself.
Wait, so the real black hole consumed all those CPU cycles? It must be way closer than the astronomers think...
Yes, but the image is simulated.

>To that effort, the **researchers created a simulation library of black holes** that leveraged the known physical properties of black holes, general relativity, and a number of other scientific areas. The idea was that this library could parse the enormous amount of data captured by the EHT array into an actual, viewable image
Does anyone understand what that title means? Explain it in a language that a real person could understand
It probably took somewhere between one computer spending 100 million hours computing and 100 million computers spending one hour computing to create this composite image.

I could have gotten the same effect in 2 minutes with a few LED lights, a dark room, and a bagel
How do we count computers now that one computer can have many processors? Don’t we need some scientific unit of measurement here, like horsepower? That really has little to do with actual horses these days.
Supercomputerologist here. The title will refer to the number of core hours. The supercomputer is Frontera at the Texas Advanced Computing Center. Their main cluster has 8,008 nodes (computers) with 2 sockets (CPUs) each, and each socket has 28 cores. So it would have taken the whole cluster about 9 days to calculate.

But cores can vary in capability, so the supercomputer equivalent of horsepower is FLOPS: floating point (arithmetic) operations per second. Their cluster is around 25 petaflops. For comparison, my desktop will do a few hundred gigaflops, and Frontier, the (soon to be) most powerful supercomputer in the world, will do about 1.5 exaflops.
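The "about 9 days" figure falls straight out of those numbers; a quick Python sanity check (the 100 million core-hour total is the title's figure):

```python
# Back-of-the-envelope check of the "about 9 days" claim.
nodes, sockets_per_node, cores_per_socket = 8008, 2, 28
total_cores = nodes * sockets_per_node * cores_per_socket  # 448,448 cores

core_hours = 100e6                          # figure from the title
walltime_hours = core_hours / total_cores   # ~223 hours
print(f"{total_cores:,} cores -> ~{walltime_hours / 24:.1f} days")  # ~9.3 days
```

That ~223 hours also lines up with the "around 200 real hours" estimate further down the thread.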
An intelligent answer on Reddit!?
Plus, they read the article:

> The vast majority of the required computing hours - around 80 million - were run on TACC’s Frontera system, a 23.5 petaflops, CentOS Linux 7-based Dell system currently ranking 13th on supercomputing's Top500 list. Frontera leverages 448,448 CPU cores courtesy of 16,016 units of Intel's Xeon Platinum 8280 chips, a Broadwell-class CPU leveraging 28 Intel cores running at 2.7GHz.
For anyone wondering like me…. that comes out to…

**1,210,809.6** Gigahertz

One point two million Gigahertz 🤤
1.21 (M) Gigahertz? Great Scott!
All you need is a little plutonium.
THERE it is, 8 responses down…
1.21 Petahertz
That’s heavy, Doc.
Not how it works.

It's like saying a highway spins at 1.2 billion RPM because of how many engines are on it.

It doesn't tell you how much power is available, or anything at all really.

GHz is no longer an indication of processing power. Processing power can only be measured by making it do work.
This system has a calculated peak performance of 38,745.9 Teraflops =

**38,745,900,000,000,000** floating point operations per second.

The fastest existing computer is in Japan and is calculated to perform at 537,212.0 Teraflops.
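That peak number is just cores × clock × FLOPs per cycle. A sketch, assuming 32 double-precision FLOPs per core per cycle (two AVX-512 FMA units; that per-cycle figure is my assumption, not something quoted in the article):

```python
# Theoretical peak = cores x clock speed x FLOPs per cycle.
# The 32 double-precision FLOPs/cycle/core (two AVX-512 FMA units) is an
# assumption about this Xeon, not a figure quoted in the article.
cores = 448_448
clock_hz = 2.7e9
flops_per_cycle = 32

peak_teraflops = cores * clock_hz * flops_per_cycle / 1e12
print(f"{peak_teraflops:,.1f} teraflops")  # ~38,745.9, matching the quoted peak
```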
Great Scott!
Subtract 5-10% for multiprocessor message passing and remote direct memory access, since the motherboard interconnects are not as fast as direct memory access.

That said, they are connecting the motherboards using Mellanox InfiniBand HDR, which can transmit data at 20 GB/s.
[deleted]
28 cores looks like "tubes" now lol boobs and tubes
Suck it, Trebek
Something is not adding up from the numbers in the title alone, if you want to shed some light.

11,415 years = 99,995,400 hours

I assume this time metric is for a single computational core doing the full calculation. Divide the core hours by the wallclock time to get the number of cores. But 100MM core hours divided by 99,995,400 hours gives 1.000046002116097 cores, which is not that impressive.

What am I doing wrong?

Edit: grammar
11,415 years = 4,166,475 days

8,008 nodes * 2 sockets * 28 cores = 448,448 cores

4,166,475 core-days / 448,448 cores = 9.3 days
If you're trying to get the number of cores that were used, you would need to divide out the walltime, which you don't have. You used 11,000 years as the walltime, which is what the article says the walltime would have been if you were doing the computation with a single core. So, of course, when you divide the total CPU-hours by that number, you get 1. That's what CPU-hr is.
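The circularity is easier to see in code (numbers from the thread):

```python
# CPU-hours = cores x walltime. The "11,415 years" figure *is* the
# single-core walltime, so dividing by it can only give back 1 core.
cpu_hours = 100e6
single_core_walltime_hours = 99_995_400   # 11,415 years, in hours
print(cpu_hours / single_core_walltime_hours)  # ~1.0 core, by construction
```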
Thanks man!
Supercomputerologist sounds really cool
To add to the other comment, the reason it is CPU hours is because a core can only do one thing at a time. So you are paying for the time to use that core. Prices can be higher depending on the type of core that is being rented.

There are only a few supercomputers in the world, and researchers have to rent time, or at least schedule time on the machines, and they bill on the combined time that each core used.
I’ve worked with a supercomputer processing a large workload one time. The unit of measurement is core hours. So if you have a 6-core processor that's utilizing all cores to process whatever you're doing, and it takes 2 hours to finish the job, that means you used 12 core hours.
Well… CPU hours would be the unit of measurement lol
But without a standard for what a CPU can calculate in an hour it's meaningless. I had a Mac SE that took an hour to load a JPEG.
Supercomputers are typically measured in FLOPS (floating point operations per second): basically any math operation, like addition, subtraction, multiplication, or division. Right now the best supercomputers are measured in the hundreds of thousands of teraflops, while an Xbox Series X can manage 12, allegedly. To your point, CPU hours is a meaningless term, but I chalk that up as more of a science journalism failure than anything else. They probably just reported the number of cores that were rented by the hour on a particular supercomputer.
CPU hour is how scientific efforts are charged for resource consumption. They must request and be approved for a research grant equal to or greater than the cost of resources required to complete their work.

Source: former supercomputer admin.
Just look up a benchmark for the processor used and run with that. Precision doesn't matter here since you're rounding to the nearest order of magnitude anyway.
>I could have gotten the same effect in 2 minutes with a few LED lights, a dark room, and a bagel

I would love to see you try to capture an image of said bagel if it was on the Moon, because that's the size of that black hole as it appears in our sky.
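The comparison holds up surprisingly well. A quick check (the ~10 cm bagel is my assumption; the ~52 microarcsecond ring diameter is the EHT's published figure for Sagittarius A*):

```python
import math

# Small-angle approximation: angular size = diameter / distance.
bagel_diameter_m = 0.10        # assumed ~10 cm bagel
moon_distance_m = 384_400e3    # average Earth-Moon distance

theta_rad = bagel_diameter_m / moon_distance_m
theta_uas = math.degrees(theta_rad) * 3600 * 1e6  # radians -> microarcseconds
print(f"~{theta_uas:.0f} microarcseconds")  # ~54, vs ~52 for Sgr A*'s ring
```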
Ah a fellow Veritasium watcher ;)
Small indie channel.
Pretty sure they originally used that analogy in the ESO press conference. Although it might have been a doughnut.
I'm gonna assume you're joking; the image is the culmination of processing mountains of data involving light as it bends around the black hole.
Should’ve just used a bagel that’s way easier
You can use an everything bagel to get that grainy effect
I think I accidentally took a photo like this with my camera phone in 2007.
[deleted]
Or you could just say it's very long baseline interferometry.
This guy lox.
per the article

>The vast majority of the required computing hours - around 80 million - were run on TACC’s Frontera system, a 23.5 petaflops, CentOS Linux 7-based Dell system currently ranking 13th on supercomputing's Top500 list. Frontera leverages 448,448 CPU cores courtesy of 16,016 units of Intel's Xeon Platinum 8280 chips, a Broadwell-class CPU leveraging 28 Intel cores running at 2.7GHz. The remainder 20 million simulation hours were computed on the NSF's open Science Grid, which leverages unused CPU cycles in a distributed computing fashion to unlock compute capabilities without the need to deploy costly supercomputers and related infrastructure.
Dude, they bought a Dell.
He asked in a language a real person could understand...
Xeon CPUs are a beast in their class, incredible tech
We can’t take a picture of a black hole with one telescope; our stuff isn’t sensitive enough and they are too far away. The only way to do it is to point a bunch of telescopes at it over a long time, coordinated around the world as it spins. The data those telescopes capture is immense, so it needs to be crunched by supercomputers in order to turn it into an image. If one computer worked on it alone, it would take 11,415 years.
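The reason linking telescopes works: resolution scales as observing wavelength divided by dish diameter, and coordinating dishes worldwide makes the effective dish Earth-sized. A rough check (1.3 mm is the EHT's published observing wavelength; the formula is the standard diffraction-limit approximation):

```python
import math

# Diffraction-limited resolution: theta ~ wavelength / aperture diameter.
wavelength_m = 1.3e-3         # EHT observes at ~230 GHz, i.e. ~1.3 mm
earth_diameter_m = 12_742e3   # effective "dish" spanning the whole planet

theta_rad = wavelength_m / earth_diameter_m
theta_uas = math.degrees(theta_rad) * 3600 * 1e6
print(f"~{theta_uas:.0f} microarcseconds")  # ~21: sharp enough for the ~52 uas ring
```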
Ok so **creating a composite image of a black hole consumed 100 million CPU hours.** A black hole did not consume any CPU hours because that obviously makes no sense.
So it used a supercomputer to process this whereas your laptop would be chugging away for millennia to try to get to the end. The laptop would likely die of old age before getting anywhere near done with a portion of the work.
It means [apparently] the image took around 200 real hours to compute. It's called journalism.

Ofc, those hours spent do not speak to the accuracy of the derived image.

For that matter, ofc, it's possible this reply is as misleading as the title.
I copy-pasted 16 million IFs inside of a for loop to make this image. It took a long-ass time
That is inefficiency on several levels, congratz.
I loved that album.
Glaciers melting in the dead of night..
And the superstars sucked into the supermassive (you set my soul on fire)
I read this to the tune of “Blackbird” by the Beatles. Was that intended? Because I can’t not read it that way.
Why does that sound like the first verse to Blackbird by The Beatles? :0
Vampire baseball!
Ah a person of culture
Honest question: Is there confidence this is actually what it looks like? Or did they write a complex algorithm with their assumption of what it should look like, input a bunch of "space noise", and the program returned an image that met their initial parameters (assumptions)?

I don't mean to be too cynical. I just don't understand where this falls on the spectrum from "Wow, this is groundbreaking stuff" to "garbage in, garbage out".
On a scale of garbage to groundbreaking stuff, this is in the "will be remembered as one of the most scientifically important photos of the twenty-first century" category. Basically, they were able to take a bunch of telescopes from points all over the Earth, from Hawaii to Europe to the South Pole, and tie them together. By using the slightly different positions from all over the planet, a computer was basically able to simulate a telescope with a mirror the size of the entire Earth, and use that to fill in the gaps in the information.

It's absolutely a real image made up of data sets from multiple telescopes. When you're looking at something that far away, though, there's going to be a lot of static. The computer is able to take the slightly different positions and figure out how much is black hole and how much is static by comparing the images.
Your answer is the clearest for me. Thank you!
Veritasium on YouTube has a cool video on it.
Thank you for the explanation! I got stuck on the part about creation of the simulation library, which made me think "they told it what image to create, so shocking result, it created the image they asked for".
You are the only real answer or information in this entire post. Why is this website so, so, so bad now? What is going on? Each day it gets worse and I tell myself to just leave - but I don’t. Then I come back the next day and say the same thing.

Fuck. Getting to the point of no return.
An easy way to empower yourself on reddit is to read the article (not assuming that you didn’t!). You’ll begin to notice that few replies engage the content of the article (therefore making you the top 1% of commenters, hooray), and you'll be able to filter/focus on the info that matters to you and what you wanna know.
Glaciers melting in the dead of night
For some reason I heard this to the tune of Blackbird by the Beatles
Take that melted ice and learn to slide
All the tides
Sooner to be rising, once the glaciers melt from ice
Same.
Me too
Take these BTUs and learn to swim
Ah a person of culture
Rather this than mining imaginary internet money.
That means nothing. 100 million per-CPU hours, or about 6.25 million average Ryzen-hours.

Or, it would have been trillions of 6502-hours, or about five minutes on an imaginary future super-CPU.

At least the article had some details on the actual hardware.
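For what it's worth, the 6.25 million figure implies a 16-core chip as the "average Ryzen"; that core count is my inference from the math, not something the comment states:

```python
# Re-expressing 100 million core-hours as whole-CPU hours.
core_hours = 100e6
cores_per_cpu = 16                  # assumed "average Ryzen" core count
print(core_hours / cores_per_cpu)   # 6,250,000.0 Ryzen-hours
```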
Explain in Star Wars terms
Consumed 100 million hours of CPU, image of black hole did, hmmm?
Choked on my bong hit
It's the ship that made the Kessel Run in less than twelve parsecs!
That hunk of junk?
1 trillion fart hours.
Whoah, that’s a lot.
How many giraffes? 📏🦒
CPU-hour is a standardized unit. It is reported in terms of a 1 GFLOP reference machine.
Explain in Big Macs
2 quarter pounders and a double quarter pounder had an orgy and the mess left over was Mac sauce
Oh my!
Really recommend checking out [Veritasium’s video](https://youtu.be/Q1bSDnuIPbo) on this.
It was very interesting. It helps explain why the computing power is necessary. Pretty crazy that stripes mixed with other stripes and more stripes makes the image.
That’s ridiculous. That picture loaded up on my phone almost instantly. Apple 💪🏻.
Help a non-scientist out. If this black hole is 27 million light years away from us, we are looking at the black hole, now, the way it appeared X years ago….
Correct. We can only see it as it was then.
That’s just the front cover of Soundgarden’s *[Superunknown](https://en.m.wikipedia.org/wiki/Superunknown)*.
So they mined 1.4 bitcoins, sounds like. Sweet.
I’m way too stupid to understand what this means
I am to stupid to understand what your comment means.
Now I’m even more confused.
All we has the are stupid
Help.
I could have drawn that in 20 min
Photos like this make me laugh because it's just a blurry fire donut and also one of the most impressive photos in human history at the same time lol
I have a bunch of photos just like that; I took them accidentally while covering the flash with my index finger. Should've just called me smh
So that's why GPU prices skyrocketed
So it took hundreds of man hours, probably millions of dollars, time and energy, just to get a picture marked nsfw? What a ripoff /s
Imagine the bitcoin they could have farmed.
What?
For curious programmers out there, the black hole images are generated with an open source Python script: https://github.com/achael/eht-imaging
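For the curious, here's a minimal sketch of how that library is typically driven, based on its README; treat the exact calls, parameters, and the input file name as assumptions to verify against the repo:

```python
# Hypothetical usage sketch of eht-imaging (import name: ehtim).
# The .uvfits file name is a placeholder, not a real dataset.
import ehtim as eh

# Load calibrated interferometric visibilities.
obs = eh.obsdata.load_uvfits('observation.uvfits')

# Build a simple Gaussian prior image over a ~200 microarcsecond field of view.
prior = eh.image.make_square(obs, 64, 200 * eh.RADPERUAS)
prior = prior.add_gauss(obs.total_flux(),
                        (60 * eh.RADPERUAS, 60 * eh.RADPERUAS, 0, 0, 0))

# Run the regularized maximum-likelihood imager and show the result.
imgr = eh.imager.Imager(obs, prior, prior_im=prior, maxit=200)
imgr.make_image_I()
imgr.out_last().display()
```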
I know what I’m doing for my next PC build! *wait*
What’s a “CPU hour”?
[deleted]
girl what?
Anyone else be wanting to just hop in one or is it just me? 😅
Laugh.
Where are the climate activists in the chat LMAO
And still didn’t actually image anything
This doesn't impress me one bit...
Just for scale, how much bitcoin would that mine?
Black hole sun
Won't you come
And wash away the rain?
Black hole sun
Won't you come
Won't you come
Won't you come
Guess all the stupid and smart ass responses are to be expected. This is an amazing find. I could explain why, but you mentally unstable dipshits wouldn’t understand
It also consumed the “r” on compute(r). 😭 Be afraid. Be very afraid.
explain like I'm five please…
Black Hole sun won't you compute my bit rate
Or roughly .000000002183 BTC.
This is nothing compared to the CPU hours spent watching porn.
That’s nothing next to calculating Matt Bellamy’s ego.
And how about the latest Pixar film? This is a dumb metric.
Blackholes using compruters now hide yo chilren hide yo wife. They computing errbody.
Explain in light years
One CPU year per parsec.
Linux
Tldr; they created an image from light bending around a black hole 4 million times the mass of our sun.

I did something similar with Photoshop and it took me 1/10th of a human hour
How many graphics card hours is that?
[deleted]
Or about the same amount of CPU time equivalent to mining 10 Bitcoin 🤣🤣🤣.
It seems like a black hole would consume an immeasurable amount of something so familiar. I don't really get it. Plus too lazy to click and read. Back to napping.
explain in destiny 2 terms
So you know how when you land in the tower and you need to go see Hawthorne but you’re not sure if you should go to the main tower area or fast travel to the far right side? Like you’re not sure which is the faster path? Alright, so basically it’s whether we wanted it or not, we've stepped into a war with the Cabal on Mars. So let's get to taking out their command, one by one. Valus Ta'aurc. From what I can gather he commands the Siege Dancers from an Imperial Land Tank outside of Rubicon. He's well protected, but with the right team, we can punch through those defenses, take this beast out, and break their grip on Freehold.
Warmind shadowkeep strike dungeon raid forsaken beyond light
Now do fast and furious terms please
Family
Gearshift gearshift turbo spool stage 2 redline
And it looks like shit lol.
well yeah, it's a black hole millions of light years away, not a nice looking rock in your front yard
It’s “only” 27,000 light years away.