
This is a version of the NotSoTiny-25-12 benchmark, modified to work with NVIDIA's CVDP framework.

If you plan to run this benchmark via the CVDP framework, use this version. Otherwise, refer to the main dataset for additional information: HPAI-BSC/NotSoTiny-25-12

Subsets and Shuttles

The default configuration combines all NotSoTiny-25-12 Tiny Tapeout shuttles into a single dataset of 1,114 tasks. You can also access each shuttle individually as a subset.

| Subset  | # Tasks | Launch date | Tiny Tapeout source                                    |
|---------|---------|-------------|--------------------------------------------------------|
| default | 1,114   | -           | -                                                      |
| tt06    | 108     | 2024-01-30  | https://tinytapeout.com/chips/tt06/                    |
| tt07    | 177     | 2024-04-22  | https://tinytapeout.com/chips/tt07/                    |
| tt08    | 196     | 2024-06-10  | https://tinytapeout.com/chips/tt08/                    |
| tt09    | 250     | 2024-09-07  | https://tinytapeout.com/chips/tt09/                    |
| tt10    | 214     | 2025-03-12  | https://tinytapeout.com/chips/ttihp25a/ (and ttihp0p2) |
| ttsky   | 169     | 2025-06-27  | https://tinytapeout.com/chips/ttsky25a/                |

Run it with CVDP

1. Clone the CVDP framework:

$ git clone https://github.com/ggcr/cvdp_benchmark
$ cd cvdp_benchmark/
$ uv init && uv add -r requirements.txt

2. Download the dataset shuttles:

$ git xet install
$ git clone https://huggingface.co/datasets/HPAI-BSC/NotSoTiny-25-12-CVDP
$ ls NotSoTiny-25-12-CVDP/shuttles/
    tt06.jsonl   tt07.jsonl   tt08.jsonl   tt09.jsonl   tt10.jsonl   ttsky.jsonl
$ cp .env.example .env && echo "OSS_SIM_IMAGE=ggcr0/turtle-eval:2.3.4" >> .env
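After step 2, the `.env` file should contain, on top of whatever defaults `.env.example` ships with, the simulator image line appended above (a sketch; the defaults themselves depend on the CVDP checkout):

```
# defaults copied from .env.example, followed by the appended line:
OSS_SIM_IMAGE=ggcr0/turtle-eval:2.3.4
```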

3. [Optional] Validate with golden solutions:

$ uv run run_benchmark.py -f NotSoTiny-25-12-CVDP/shuttles/tt06.jsonl

4. Run the benchmark (inference + eval):

$ export OPENROUTER_API_KEY=sk-or-v1-... 
$ uv run run_samples.py \
       -f NotSoTiny-25-12-CVDP/shuttles/tt06.jsonl \
       -l \
       -m mistralai/codestral-2508 \
       -c examples/openrouter_factory.py \
       -n 3 \
       -k 1

You can repeat this process for any other shuttle present in the shuttles/ dir.

We also recommend using the -t <workers> flag to speed up the process.
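Each shuttle file is plain JSONL, one task per line. As a minimal sketch of inspecting a record, the snippet below parses a hypothetical stub; the field names (id, categories, input, output, harness) come from the dataset schema, but the values are illustrative only:

```python
import json

# Hypothetical one-line record mirroring the shuttle JSONL schema.
# Real records carry full Verilog sources and a Docker-based harness.
line = json.dumps({
    "id": "tt06-CEJMU-tt06_tinyrv1-task_alu_0001",
    "categories": ["tt06"],
    "input": {"prompt": "You are an expert Verilog hardware designer..."},
    "output": {"response": "", "context": {"rtl/generated.v": "..."}},
    "harness": {"files": {"Dockerfile": "FROM __OSS_SIM_IMAGE__"}},
})

record = json.loads(line)
print(record["id"])          # task identifier
print(record["categories"])  # shuttle tag(s), e.g. ["tt06"]
```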

Additional Usage

from datasets import load_dataset

# Load the complete dataset with all shuttles (1,114 tasks)
ds = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", split="test")

# Or load a specific shuttle
ds_tt06  = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "tt06",  split="test") # 108 tasks
ds_tt07  = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "tt07",  split="test") # 177 tasks
ds_tt08  = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "tt08",  split="test") # 196 tasks
ds_tt09  = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "tt09",  split="test") # 250 tasks
ds_tt10  = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "tt10",  split="test") # 214 tasks
ds_ttsky = load_dataset("HPAI-BSC/NotSoTiny-25-12-CVDP", "ttsky", split="test") # 169 tasks

License

The dataset is released under the Apache License 2.0.

Citation Information

@misc{ghorab2025notsotinylargelivingbenchmark,
      title={NotSoTiny: A Large, Living Benchmark for RTL Code Generation}, 
      author={Razine Moundir Ghorab and Emanuele Parisi and Cristian Gutierrez-Gomez and Miquel Albert\'i-Binimelis and Miquel Moreto and Dario Garcia-Gasulla and Gokcen Kestor},
      year={2025},
      eprint={2512.20823},
      archivePrefix={arXiv},
      primaryClass={cs.AR},
      url={https://arxiv.org/abs/2512.20823},
}

Acknowledgements

The HPAI team behind NotSoTiny would like to thank the Tiny Tapeout community for their open-source efforts, which made this contribution possible. Special thanks to Matt Venn for his support.

We would also like to thank the CVDP team at NVIDIA for developing and open-sourcing the CVDP framework, and the Si2 Coalition Extend and Expand working group.
