
Cracks in the Foundation

A civil-infrastructure visual inspection dataset for instance segmentation with 6 defect/condition categories: Algae · Crack · Net-Crack · Crack with Precipitation · Rust · Spalling

Each sample is either a full-resolution inspection image or a 1024×1024 tile derived from one. Tiled samples carry extra fields (tile_row, tile_col, file_name_original, …) that are None for full-resolution samples.


Splits

Each split is its own Parquet shard and its own dataset config. load_dataset(repo) returns all six splits in a single DatasetDict. Naming a config, e.g. load_dataset(repo, "train_tiled", split="train"), is a true selective download: only that shard's Parquet files are fetched.

Split         Contents
train_full    full-resolution training images
val_full      full-resolution validation images
test_full     full-resolution test images
train_tiled   1024×1024 tiles, training
val_tiled     1024×1024 tiles, validation
test_tiled    1024×1024 tiles, test

Load

from datasets import load_dataset

# Load all six splits at once (single DatasetDict):
all_splits = load_dataset("ibm-research/cif-dataset")

# Selective: download only the tiled training shard.
# Each split is also exposed as its own config — naming a config
# downloads only its parquet files.
ds = load_dataset("ibm-research/cif-dataset", "train_tiled", split="train")

Schema

Every sample has the same fields regardless of split:

sample = ds[0]

sample["image_id"]           # int   — unique image identifier
sample["image"]              # PIL.Image
sample["file_name"]          # str   — original filename
sample["width"]              # int   — image width in pixels
sample["height"]             # int   — image height in pixels

# Tiled-only fields (None for full-resolution samples):
sample["tile_row"]           # int | None  — top-left row of the tile in the original image
sample["tile_col"]           # int | None  — top-left column
sample["file_name_original"] # str | None  — filename of the parent image
sample["width_original"]     # int | None  — parent image width
sample["height_original"]    # int | None  — parent image height

# Annotations (COCO convention):
obj = sample["objects"]
obj["id"]            # List[int]
obj["category_id"]   # List[int]   — 1=Algae 2=Crack 3=Crack(net) 4=Crack+precip 5=Rust 6=Spalling
obj["bbox"]          # List[[x, y, w, h]]   — pixels, COCO origin (top-left)
obj["area"]          # List[float]
obj["iscrowd"]       # List[int]
obj["segmentation"]  # List[List[List[float]]]  — polygons as flat [x1,y1,x2,y2,...] lists
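Annotations are stored column-wise (parallel lists keyed by field). When per-instance iteration is more convenient, a small helper can transpose them; `objects_to_instances` is a hypothetical name, not part of the dataset API:

```python
def objects_to_instances(objects):
    """Transpose the column-wise `objects` dict into one dict per instance."""
    keys = list(objects.keys())
    n = len(objects["id"])
    return [{k: objects[k][i] for k in keys} for i in range(n)]
```

For example, `objects_to_instances(sample["objects"])[0]["bbox"]` yields the first instance's bounding box.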

Distinguish sample type at runtime:

is_tile = sample["tile_row"] is not None
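For tiled samples, annotations are in tile-local coordinates. A minimal sketch of mapping a tile bbox back into the parent image, assuming tile_row/tile_col give the pixel offsets of the tile's top-left corner as the schema comments describe (if they turn out to be grid indices instead, multiply by the 1024 tile size first):

```python
def tile_bbox_to_original(bbox, tile_row, tile_col):
    """Shift a COCO [x, y, w, h] bbox from tile-local to parent-image coordinates.

    Assumes tile_row/tile_col are pixel offsets of the tile's top-left
    corner in the original image (our reading of the schema; verify
    against your samples).
    """
    x, y, w, h = bbox
    return [x + tile_col, y + tile_row, w, h]
```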

Visualize

pip install datasets fiftyone
import tempfile
from pathlib import Path

import fiftyone as fo
from datasets import load_dataset

CATS = {1: "Algae", 2: "Crack", 3: "Crack (net-crack)",
        4: "Crack with precipitation", 5: "Rust", 6: "Spalling"}

ds = load_dataset("ibm-research/cif-dataset", split="test_full")

tmp = Path(tempfile.mkdtemp())
fo_ds = fo.Dataset("cif_test_full", overwrite=True)

for s in ds:
    # FiftyOne reads images from disk, so materialize each PIL image first.
    img_path = tmp / Path(s["file_name"]).name
    s["image"].save(img_path)
    W, H = s["width"], s["height"]
    dets, polys = [], []
    obj = s["objects"]
    for i, cid in enumerate(obj["category_id"]):
        label = CATS.get(cid, str(cid))
        # FiftyOne expects relative [x, y, w, h] coordinates in [0, 1].
        x, y, w, h = obj["bbox"][i]
        dets.append(fo.Detection(label=label, bounding_box=[x/W, y/H, w/W, h/H]))
        for poly in obj["segmentation"][i]:
            if len(poly) < 6:  # a valid polygon needs at least 3 points
                continue
            pts = [[poly[j]/W, poly[j+1]/H] for j in range(0, len(poly), 2)]
            polys.append(fo.Polyline(label=label, points=[pts], filled=True, closed=True))
    fo_ds.add_sample(fo.Sample(
        filepath=str(img_path),
        detections=fo.Detections(detections=dets),
        segmentations=fo.Polylines(polylines=polys),
    ))

session = fo.launch_app(fo_ds)
session.wait()

Opens the FiftyOne app at http://localhost:5151 with bounding boxes and segmentation overlays.
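Since the `objects` field already follows COCO conventions, assembling a standard COCO-format annotations dict (e.g. for pycocotools or an existing training pipeline) is mostly bookkeeping. A sketch using only the schema fields documented above; `build_coco_dict` is a helper name of our own, not part of the dataset:

```python
def build_coco_dict(samples, categories):
    """Assemble a COCO-style dict from an iterable of dataset samples.

    `samples` are dicts with the schema fields documented above;
    `categories` maps category_id -> name.
    """
    coco = {
        "images": [],
        "annotations": [],
        "categories": [{"id": cid, "name": name}
                       for cid, name in sorted(categories.items())],
    }
    for s in samples:
        coco["images"].append({
            "id": s["image_id"],
            "file_name": s["file_name"],
            "width": s["width"],
            "height": s["height"],
        })
        obj = s["objects"]
        for i, ann_id in enumerate(obj["id"]):
            coco["annotations"].append({
                "id": ann_id,
                "image_id": s["image_id"],
                "category_id": obj["category_id"][i],
                "bbox": obj["bbox"][i],
                "area": obj["area"][i],
                "iscrowd": obj["iscrowd"][i],
                "segmentation": obj["segmentation"][i],
            })
    return coco
```

The result can be serialized with json.dump and fed to any tool that consumes COCO annotation files.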


Acknowledgment

We would like to sincerely thank Finn Bormlund and Svend Gjerding (Sund & Baelt), Jens Häggström (Trafikverket), Raphael von Thiessen (Innovation-Sandbox for AI, Office for Economy, Kanton Zürich), and the Dübendorf Air Base for granting us the opportunity to collect, analyze, and disseminate the images and defect data included in this publication.


Citation

@dataset{cracks_in_the_foundation,
  author    = {},
  title     = {Cracks in the Foundation},
  year      = {2025},
  publisher = {HuggingFace},
  url       = {https://huggingface.co/datasets/ibm-research/cif-dataset},
}