Joseph Wang, Ex-VP Quant, Investment Bank, Hong Kong
Answered 197w ago · Author has 15.7K answers and 43.1M answer views
Supercomputers aren't good for mining. Also you can think of a supercomputer as a set of highly networked nodes, so without software the supercomputer wouldn't be that useful.
What I'd probably do is to get some commercial ray tracing software and write an animated Valentine's day card. Either that or just run some prime number finding software.
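For the prime-finding option, the core of such software is simple and parallelises well by splitting the search range across nodes. A minimal single-machine sketch (a plain Sieve of Eratosthenes, not any particular package the author had in mind):

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: return all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # cross off every multiple of p starting at p*p
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i, flag in enumerate(sieve) if flag]

print(primes_up_to(30))  # -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

On a real machine the interesting part is not the sieve but carving the range into chunks, one per node.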
If you give me a week's notice, I can start programming something more useful. What I would likely do is to take the MESA stellar evolution code, and do some parameter searches.
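A parameter search of this kind is embarrassingly parallel: each point in the grid is an independent run. A sketch of that structure, where the model function is a made-up stand-in (a toy mass-luminosity law) rather than anything MESA actually computes:

```python
import itertools
# from concurrent.futures import ProcessPoolExecutor  # for a real sweep

def run_model(params):
    """Stand-in for one stellar-evolution run: take a (mass, metallicity)
    pair and return a single summary quantity (toy mass-luminosity law)."""
    mass, z = params
    return mass ** 3.5 * (1 - z)

# the parameter grid: every (mass, metallicity) pair is an independent job
grid = list(itertools.product([0.8, 1.0, 1.5, 2.0], [0.001, 0.01, 0.02]))

# sequential here; on a cluster, pool.map(run_model, grid) fans this out
results = {p: run_model(p) for p in grid}
best = max(results, key=results.get)
print(best)  # -> (2.0, 0.001)
```

Swapping the dict comprehension for `ProcessPoolExecutor().map` (or one cluster job per grid point) is the whole parallelisation story.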
One other thing is that I worked at a big investment bank, and I can tell you what they used the supercomputers for. It was risk management calculations. You have one derivative security. You can calculate the price of that on a desktop system. The bank has tens of thousands of these securities, and the Fed wants to know what happens under condition X, and the traders want to know how much they are likely to make or lose on a typical day. So you need to run massive numbers of scenarios. This is the type of thing that a supercomputer is perfect for.
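A toy sketch of the embarrassingly parallel structure described above. This is not the bank's actual risk engine: the portfolio, the single lognormal shock model, and every number here are illustrative assumptions.

```python
import math
import random

def portfolio_pnl(positions, shock):
    """P&L of a linear book under one market-wide return shock (toy model)."""
    return sum(qty * price * shock for qty, price in positions)

def run_scenarios(positions, n_scenarios, vol=0.02, seed=42):
    """Revalue the whole book under many random scenarios. Each scenario
    is independent of the others, which is why this workload spreads so
    naturally across a cluster's nodes."""
    rng = random.Random(seed)
    pnls = []
    for _ in range(n_scenarios):
        shock = math.exp(rng.gauss(0.0, vol)) - 1.0  # lognormal return
        pnls.append(portfolio_pnl(positions, shock))
    pnls.sort()
    return -pnls[int(0.01 * n_scenarios)]  # 99% one-day value-at-risk

book = [(100, 50.0), (-20, 200.0)]  # hypothetical (quantity, price) pairs
print(f"99% VaR: {run_scenarios(book, 100_000):.2f}")
```

Scale the book to tens of thousands of securities, each with its own pricing model, and the desktop-per-option / supercomputer-per-book split in the answer falls out naturally.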
What is interesting is that the supercomputers at the bank I worked at were Intel systems with Nvidia GPUs and ran the same software as the desktop systems. So if you wanted to price one set of options you did it on your desktop, whereas once you booked the option, the bank's supercomputer would start to risk manage it alongside the ton of other securities you had.
Supercomputers aren't used for algo trading. Mainframes are.
Answered 205w ago
That's what I do with the (only somewhat limited) access I have to a supercomputer all the time. I'm nothing special, I just work at a university. Research is probably what most supercomputers in the world are used for.
This obviously depends what you mean by supercomputer. Ours is quite modest: 980 compute cores with 4 GB of RAM per core. Never going to break any records (e.g. Tianhe-2 has 3 million cores), but still pretty quick. Most good universities have something at least this big.
There are no hard facts that I can find, due to the difficulty of defining what counts as a supercomputer and the lack of information on who owns what and what they do with them. But clearly universities are among the biggest customers.
This is because there is a balance between the cost of a supercomputer and the added-value gained through the use of a supercomputer. What company faces problems so difficult that they need supercomputing power to get a good enough answer, and also has the money to afford one? Pixar for all those renderings. Boeing and F1 teams for CFD. GSK for molecular modelling. Some big players in finance, presumably (they're not saying). But the number of such big, rich, specialist companies is small, whereas the number of universities is large.
Research is about finding things out, and these days running better (or more) simulations might improve the quality of data somewhat. Being 1% more accurate is important here, so you buy a bigger computer, whereas finding commercial cases where that sort of accuracy can pay for itself is much rarer. Industry often puts lots of effort into improving the approximations used to deliver meaningful (but not perfect) results to clients in a cost and time effective manner. Academics don't care so much about the time, because getting one really good answer is worth waiting for.
Also, academia has a different mentality when it comes to funding. Finding $1m for an impressive bit of kit can be easier than finding $2k to go to a conference. Big expensive equipment is often provided in the hope that someone will do something useful and/or impressive with it. The benefits cannot be quantified, certainly not in advance, so this sidesteps the cost-benefit analysis that would squash things like this in the real world. Surprisingly often, "because it's really cool!" is a good enough reason to buy something.
(This is specifically discounting the web infrastructure that the datacenters of Google and Facebook et al run: they are built to do huge numbers of tiny things in a very distributed way, whereas I'm talking about machines that can do single big tasks, albeit in a parallelised form.)
Addendum: Supercomputers aren't that good at crypto-mining - they're general-purpose tools that aren't going to compete with specialised rigs.
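The addendum is easy to make concrete: mining is nothing but a brute-force search for a nonce whose hash falls below a target, which is exactly the one fixed operation an ASIC is built to do billions of times per second and a general-purpose machine is not. A toy hashcash-style sketch (real Bitcoin mining double-hashes a specific 80-byte block header; everything here is illustrative):

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 10_000_000):
    """Search for a nonce such that SHA-256(header + nonce) has at least
    `difficulty_bits` leading zero bits. Returns the nonce, or None if
    none is found within max_nonce attempts."""
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
    return None

nonce = mine(b"toy-block", difficulty_bits=16)
print(nonce)
```

A supercomputer runs this loop no better per watt than a desktop CPU does, while a mining ASIC does nothing else and does it orders of magnitude faster.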
Philip Howie, materials scientist, academic, researcher
Answered 121w ago · Author has 1.5K answers and 2.3M answer views
I used to do research into dislocation plasticity in crystalline materials. I was interested in finding out what makes one material harder than another, at a fundamental atomic level.
Part of this research involved density functional theory (DFT) modelling of interactions between atoms at the core of a dislocation. I used the CamGrid high-throughput computing resource at the University of Cambridge, which gave me access to a few dozen cores. With this, I managed to simulate elemental metals and a few simple intermetallic compounds.
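For a sense of the shape of such atomistic calculations (though not of DFT itself, which solves for electronic structure and is what actually needed the CamGrid cores), here is a classical stand-in: the total energy of a small atom cluster under a Lennard-Jones pair potential. All parameters are illustrative.

```python
import itertools
import math

def lattice_energy(positions, epsilon=1.0, sigma=1.0):
    """Total energy of a finite atom cluster under a Lennard-Jones pair
    potential -- a classical stand-in for the quantum-mechanical energies
    a DFT code actually computes."""
    total = 0.0
    for a1, a2 in itertools.combinations(positions, 2):
        r = math.dist(a1, a2)
        total += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return total

# a tiny 2x2x2 simple-cubic cluster, spaced at the LJ pair minimum
a = 2 ** (1 / 6)
atoms = [(i * a, j * a, k * a)
         for i in range(2) for j in range(2) for k in range(2)]
print(f"cluster energy: {lattice_energy(atoms):.3f} epsilon")
```

The pairwise loop is O(n²) in the atom count; DFT scales far worse than that, which is why even "a few dozen cores" only gets you elemental metals and simple intermetallics.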
I still have some of my old codes knocking around, and I can still generate a crystal structure in a few minutes. Given some more computing grunt for a day, I’d probably knock up something much more complex (Ta39Al69?) and simulate dislocation motion. If nothing else, the resulting paper would make a lot of computational materials scientists jealous!
Jack Dahlgren, works at NVIDIA
Updated 205w ago · Upvoted by
William Emmanuel Yu, computer networks teacher · Author has 7k answers and 9.5m answer views
Read a book or play guitar or go for a walk. Maybe answer a few questions here.
Thing is, I have the equivalent of a supercomputer (from the last decade) sitting at home. The GPU in my computer can do a few teraflops.
I don't do any supercomputing with it. Sure, I could mine coins, but that is a waste of energy in my opinion. The big problem in supercomputing is defining the problem and building the model. Both of those things take more than 24 hours, and I just don't have any pressing needs that would justify spending months or years setting up a problem like that.
UPDATE: Looks like the people down the hall from me are working on making the fastest supercomputer in the world.
Chip Frank, works at Computer Programming
Answered 205w ago · Author has 1.2K answers and 1.3M answer views
It would depend on the specific supercomputer.
There are certain configurations I would study. (As in, I would not be putting those resources to any good use, but I'd be looking into how the system was set up and configured.)
Anything with a heavily parallelised segmentation of hardware compute and storage resources... I'd be looking at everything from settings and configuration to kernel configuration... whatever I could extract -- this stuff fascinates me.
Anything I was familiar with, such as FermiLinux or Scientific Linux (CERN) -- I'd probably just set a BOINC client running on it to help out with some project or other (DNETC, PrimeGrid, or the like).
The latter, consumer-available supercomputer configurations (clusters) are of no interest to me. (I've studied clusters for years, finally gave it all up, and have a couple of aging big-iron boxes I power on every once in a while to keep my skills up and test new things I haven't heard of that become publicly available, etc...)
But anything I don't have experience with is fair game for me to play with if I could do whatever I wanted for 24 hours.
Jae Alexis Lee, This is what I actually went to school for!
Answered 205w ago · Author has 4.3K answers and 21.8M answer views
Nothing worth that much horsepower. I'd probably finish rendering art assets for my game. It's not a big idea, it's not impressive, and it wouldn't require a supercomputer to do it. I've recently moved to an i7-5960X, and it churns through renders significantly faster than my previous render farm did, but it still puts delays in my production process simply because of computational time.
But I don't need a supercomputer's worth of processing power. It would be nice, though, for a period of time, to not have hardware be the bottleneck on what I could create.