# Mauritius IT Jobs Dataset
This dataset contains IT job listings scraped from myjob.mu, the primary job board in Mauritius.
This dataset is part of the MyJobViz project, which provides visualization and analysis tools for the Mauritius IT job market.
## Dataset Structure

Each job listing contains the following fields:

| Field | Type | Description |
|---|---|---|
| `job_title` | string | Title of the job position |
| `company` | string | Name of the hiring company |
| `date_posted` | datetime | When the job was posted (Unix timestamp in ms) |
| `closing_date` | datetime | Application deadline (Unix timestamp in ms) |
| `location` | string | Job location within Mauritius |
| `employment_type` | string | Type of employment (e.g., Full-time, Trainee, Contract) |
| `salary` | string | Salary range or information |
| `job_details` | string | Full job description and requirements |
| `url` | string | Original job posting URL |
| `timestamp` | datetime | When the job was scraped (Unix timestamp in ms) |
## Example Entry

```json
{
  "job_title": "it helpdesk support - raw it services ltd",
  "company": "TAYLOR SMITH GROUP",
  "date_posted": 1676332800000,
  "closing_date": 1678924800000,
  "location": "Port Louis",
  "employment_type": "Trainee",
  "salary": "10,000 - 20,000",
  "job_details": "Your role will consist of providing support to the technical team...",
  "url": "http://myjob.mu/Jobs/IT-Helpdesk-Support-Raw-IT-141921.aspx",
  "timestamp": 1676427651200
}
```
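The millisecond timestamps above can be decoded with the Python standard library; for instance, `date_posted` in this entry resolves to 14 February 2023 (UTC):

```python
from datetime import datetime, timezone

# Millisecond Unix timestamps from the example entry above
date_posted_ms = 1676332800000
closing_date_ms = 1678924800000

# Divide by 1000 to get seconds before converting
date_posted = datetime.fromtimestamp(date_posted_ms / 1000, tz=timezone.utc)
closing_date = datetime.fromtimestamp(closing_date_ms / 1000, tz=timezone.utc)

print(date_posted.date())   # 2023-02-14
print(closing_date.date())  # 2023-03-16
```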
## Files

- `jobs.json`: Complete job listings in JSON format
- `jobs.csv`: Complete job listings in CSV format (easier to preview)
- `metadata.json`: Backup metadata and statistics
## Usage

### Load with HuggingFace Datasets

```python
from datasets import load_dataset

# Load the dataset
dataset = load_dataset("goated69/mauritius-it-jobs")
print(dataset)
```
### Load with Pandas

```python
import pandas as pd

# From CSV
df = pd.read_csv("hf://datasets/goated69/mauritius-it-jobs/jobs.csv")

# From JSON
df = pd.read_json("hf://datasets/goated69/mauritius-it-jobs/jobs.json")

# Convert millisecond Unix timestamps to datetime
df['date_posted'] = pd.to_datetime(df['date_posted'], unit='ms')
df['closing_date'] = pd.to_datetime(df['closing_date'], unit='ms')
df['timestamp'] = pd.to_datetime(df['timestamp'], unit='ms')
```
## Potential Use Cases
- Job Market Analysis: Analyze trends in Mauritius IT job market
- Salary Research: Study compensation patterns for IT roles
- Skills Demand: Identify most in-demand technologies and skills
- Location Analysis: Understand geographic distribution of IT jobs
- Time Series Analysis: Track job posting trends over time
- NLP Tasks: Job description classification, entity extraction, skill identification
- Career Planning: Research companies, job titles, and requirements in Mauritius IT sector
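As a sketch of the skills-demand use case, one could count keyword mentions in the `job_details` field. The skill list below is illustrative, not part of the dataset:

```python
import re
from collections import Counter

# Hypothetical skill keywords -- extend as needed for a real analysis
SKILLS = ["python", "java", "sql", "aws", "react"]

def count_skill_mentions(job_details_list):
    """Count how many job descriptions mention each skill (case-insensitive)."""
    counts = Counter()
    for details in job_details_list:
        text = details.lower()
        for skill in SKILLS:
            # Word boundaries avoid matching e.g. "java" inside "javascript"
            if re.search(rf"\b{re.escape(skill)}\b", text):
                counts[skill] += 1
    return counts

sample = [
    "Looking for a Python developer with SQL experience.",
    "Java engineer needed; AWS knowledge a plus.",
]
print(dict(count_skill_mentions(sample)))  # {'python': 1, 'sql': 1, 'java': 1, 'aws': 1}
```

With the real dataset, pass `df['job_details'].dropna()` to the function instead of the sample list.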
## Data Collection
Data is collected through automated web scraping of myjob.mu, focusing exclusively on IT-related job postings. The scraper runs periodically to capture new job listings.
**Scraping Code**: The code used to scrape and process this data is available at [github.com/creme332/myjobviz](https://github.com/creme332/myjobviz).
## Limitations
- Only includes IT jobs (other sectors are not covered)
- Geographic scope limited to Mauritius
- Dependent on data availability from source website
- Some fields may contain inconsistent formatting (e.g., whitespace in location field)
- Salary information may be missing or in various formats
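Given these formatting caveats, a small normalisation pass may help before analysis. The salary pattern below is an assumption based on the example entry ("10,000 - 20,000"), not a guaranteed format:

```python
import re

def clean_location(location):
    """Collapse internal whitespace and strip the ends of a location string."""
    return re.sub(r"\s+", " ", location).strip()

def parse_salary_range(salary):
    """Parse a 'low - high' salary string like '10,000 - 20,000' into ints.

    Returns (low, high), or None when the field is empty or in an
    unrecognised format (salary values vary and may be missing).
    """
    if not salary:
        return None
    match = re.match(r"\s*([\d,]+)\s*-\s*([\d,]+)\s*$", salary)
    if not match:
        return None
    low, high = (int(part.replace(",", "")) for part in match.groups())
    return (low, high)

print(clean_location("  Port  Louis "))       # Port Louis
print(parse_salary_range("10,000 - 20,000"))  # (10000, 20000)
print(parse_salary_range("Negotiable"))       # None
```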
## Statistics
The statistics.json file (if included) contains aggregated data on:
- Programming languages mentioned in job descriptions
- Technologies and frameworks (databases, cloud platforms, tools)
- Job title distributions
- Location distributions
- Salary ranges
- And more...
## Citation

If you use this dataset in your research or analysis, please cite:

```
Mauritius IT Jobs Dataset, scraped from myjob.mu
Part of the MyJobViz project: https://github.com/creme332/myjobviz
Available at: https://huggingface.co/datasets/goated69/mauritius-it-jobs
```
## License

This dataset is provided under the CC-BY-4.0 license. The data is scraped from publicly available job listings.
## Backup History

This dataset is automatically updated with regular backups to preserve historical job market data.

**Last Updated**: 2026-01-01_03-30-50