Founded Nonlinear.org (we're hiring!) to help scale AI Safety research progress
That depends on the funders! Given enough bounties, I'd expect an optimal bounty distribution to look power-law-ish, with a few big bounties (>10k-1m?) and many small ones (<10k).
I didn't think about it much; public might be better. I assumed some people would be hesitant to share publicly and that I'd get more submissions if it were private, but I'm not sure that offsets the creative stimulus of sharing publicly.
I actually think that's an interesting idea! I like using bounties to spur more bounty innovation. I'd love to see more bounties like this; let's try mapping the whole design space.
Shovel-ready bounties are preferred, but to avoid premature exploitation I'd just like to hear as many ideas as possible at this point. Some ideas might require back and forth, and that's ok! Seeing the ideas come in is already giving me lots of ideas for ways to potentially scale this.