Dataset upload stuck in Processing; platform became unusable

Hi,

I uploaded multiple videos to create a dataset on Ultralytics Platform.

During the upload, several red error messages appeared and the dataset moved into Processing.

After that point, my account entered a broken state.

I recorded a video showing the full sequence and the current state:

The dataset has remained stuck in Processing for over 24 hours.

Refreshing the page, logging out, and logging back in do not change anything.

There is no option to stop, cancel, delete, or recover the dataset. I cannot proceed with model training while the workspace remains in this state.

I am sharing this to check if anyone has seen similar behavior. Thanks!

Thanks for the report. Forwarded it to the team.

@Alex_Motion Can you try now?

I can’t reply with my main account because the system flagged me as spam after I posted my issue in the forum. I guess this is another bug that should be reported to your forum admins.

The issue is still there. Here is the video recording of the issue:

Did this issue occur when trying to upload additional images to an existing dataset?

Should be fixed now


This happens for me too.

Reference file:

https://files.catbox.moe/1wskai.zip

When I unzip and train locally it works. But I'm not rich; I like the access Ultralytics gives you to very high-end GPUs, and the interface is amazing.

Good job on the new Platform site!

Look, I didn't upload 300,000 images, just 5,000, and each file is 9 KB.

LLM prompt idea:

```
give me a service idea for a service that ingests ZIP or TAR files and turns them into a dataset
it should handle large datasets of 30,000 files and jobs from many users
think like a UNIX wizard; you are an expert with bash and pipelines
list 5 common issues in the plan up front, and how to avoid them
```

Thanks for sharing the reference ZIP — that’s really helpful. If it unzips and trains locally, this does look more like an Ultralytics Platform ingestion issue than a dataset-format issue. 5,000 tiny files should still be well within the supported upload flow; Platform accepts .zip archives up to 10 GB and then runs validation, label parsing, and stats generation as described in the dataset upload docs.

If you can, please confirm whether this happened on a brand-new dataset or when adding files to an existing dataset, since that was related to the earlier fix. As a temporary workaround, try uploading into a fresh dataset or splitting the archive into 2–3 smaller ZIPs.
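If it helps with the splitting workaround, here's a minimal sketch using only the Python standard library. It assumes a flat ZIP of image files (adjust if your archive has an `images/` and `labels/` directory layout) and distributes the members round-robin into smaller ZIPs; the function name and output paths are just illustrative, not part of any Platform API.

```python
import zipfile
from pathlib import Path


def split_zip(src, parts=3, out_dir="chunks"):
    """Split a ZIP archive into `parts` smaller ZIPs, round-robin by file.

    Directory entries are skipped; each output archive is written with
    standard DEFLATE compression. Returns the list of part paths.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    part_paths = [out / f"part{i + 1}.zip" for i in range(parts)]

    with zipfile.ZipFile(src) as zin:
        # Keep only real files, not directory placeholder entries.
        names = [n for n in zin.namelist() if not n.endswith("/")]
        writers = [
            zipfile.ZipFile(p, "w", zipfile.ZIP_DEFLATED) for p in part_paths
        ]
        try:
            for i, name in enumerate(names):
                # Preserve the original internal path in each chunk.
                writers[i % parts].writestr(name, zin.read(name))
        finally:
            for w in writers:
                w.close()

    return part_paths
```

Each resulting `partN.zip` can then be uploaded as (or into) its own dataset, which keeps any single ingestion job well under the problematic size.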

And thanks for the kind words — credit goes to the Ultralytics team and community :handshake: