Apply for community grant: Academic project (gpu)
The project demos a loading bar for reasoning models, based on the paper "Overclocking LLM Reasoning: Monitoring and Controlling Thinking Path Lengths in LLMs" (https://arxiv.org/abs/2506.07240).
It allows users to prompt a reasoning model with a problem and observe the thinking process accompanied by a loading bar visualization.
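For context, here's a minimal sketch of what such a demo might look like. This is not the actual Space code: the model ID, the token budget, and the simple token-count progress proxy are placeholder assumptions (the paper's method monitors the thinking-path progress from the model itself), but it shows the prompt-stream-progress flow.

```python
# Minimal sketch of the demo flow (not the actual Space code). The model ID,
# token budget, and the token-count "progress" proxy are placeholder assumptions.
import threading

import gradio as gr
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # placeholder reasoning model
MAX_NEW_TOKENS = 1024  # assumed generation budget used for the progress estimate

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)

def solve(problem: str):
    """Stream the model's output and yield (text so far, progress in [0, 1])."""
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": problem}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
    threading.Thread(
        target=model.generate,
        kwargs=dict(input_ids=input_ids, streamer=streamer, max_new_tokens=MAX_NEW_TOKENS),
    ).start()
    text, n_chunks = "", 0
    for chunk in streamer:
        text += chunk
        n_chunks += 1
        # Crude loading-bar proxy: fraction of the token budget consumed so far.
        yield text, min(n_chunks / MAX_NEW_TOKENS, 1.0)

demo = gr.Interface(
    fn=solve,
    inputs=gr.Textbox(label="Problem"),
    outputs=[gr.Textbox(label="Thinking / answer"), gr.Slider(0, 1, label="Thinking progress")],
)

if __name__ == "__main__":
    demo.launch()
```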
I am applying for this grant following a suggestion made by @NielsRogge in our GitHub repo: https://github.com/royeisen/reasoning_loading_bar/issues/1
Thanks @hysts!
The Space encounters a runtime error related to the GPU memory limit:
The demo uses roughly 65 GB:
If you could provide a device with this amount of memory, that would be great. Otherwise, please let me know and I'll find another solution.
@royeis
I think the error is related to the CPU RAM limit, not GPU RAM. The underlying hardware for ZeroGPU Spaces is half of an H200 (a MIG slice), so it has roughly 70 GB of VRAM.
Unfortunately, there's no grantable hardware that meets both the CPU and GPU RAM requirements. Hopefully, you'll be able to find another way.
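If it helps, one possible direction (purely a sketch, assuming the model is loaded via Transformers; these are standard `from_pretrained` options, not necessarily what the Space does today) is to keep peak CPU RAM low by streaming checkpoint shards straight to the GPU at load time:

```python
# Hypothetical workaround sketch, assuming the model is loaded with Transformers:
# stream checkpoint shards directly to the GPU so peak host (CPU) RAM stays low.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "org/reasoning-model",        # placeholder model id
    torch_dtype=torch.float16,    # half-precision weights: half the memory of float32
    low_cpu_mem_usage=True,       # load shard-by-shard instead of materializing everything in RAM
    device_map="auto",            # place weights directly on the available GPU
)
```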
