Job Description
Role Overview:
We are seeking a Junior Data Scientist to join our dynamic team. The primary responsibility of this role is to collaborate with our data science team to clean and process the data ingested by Envelop's proprietary stochastic cyber risk model. The ideal candidate will have a keen eye for detail, an analytical mindset, and a passion for data. This role owns the cleaning, enrichment, and ingestion of insurance portfolio data, along with the processes that support it.
Responsibilities:
· Clean, standardize, and validate insurance portfolio data using Python (especially Pandas) to align with a common schema, ensuring consistency, completeness, and high data quality for downstream analysis.
· Develop, maintain, and improve internal utilities and functions that support the data cleaning pipeline, including code refactoring, fixing bugs, and implementing new features.
· Collaborate on initiatives involving large language models (LLMs) to automate data preparation tasks, including prompt engineering, experimentation, and evaluation of model outputs.
· Contribute to data reporting tasks, including the generation of insights related to insurance exposures and risk events.
· Conduct ad hoc analysis and communicate insights clearly to technical and non-technical stakeholders.
· Work alongside modellers who statistically analyse reinsurance portfolios.
· Apply software engineering practices to build flexible, maintainable data systems.
· Identify opportunities for continuous improvement across data collection, processing workflows, and overall data quality.
· Engage in continuous learning to deepen knowledge of data, reinsurance, and the cyber risk landscape.
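To illustrate the kind of cleaning and standardization work described above, here is a minimal Pandas sketch. The column names, the target schema, and the sample data are hypothetical for illustration only, not Envelop's actual schema:

```python
import pandas as pd

# Hypothetical raw portfolio extract; real source schemas will vary by cedent.
raw = pd.DataFrame({
    "Insured Name": [" Acme Corp ", "Globex", None],
    "Limit (USD)": ["1,000,000", "250000", "500,000"],
    "Inception": ["2024-01-01", "2024-02-01", "2024-03-15"],
})

# Map source columns onto an assumed common schema.
COLUMN_MAP = {
    "Insured Name": "insured_name",
    "Limit (USD)": "limit_usd",
    "Inception": "inception_date",
}

def clean_portfolio(df: pd.DataFrame) -> pd.DataFrame:
    out = df.rename(columns=COLUMN_MAP).copy()
    # Standardize text fields: strip surrounding whitespace.
    out["insured_name"] = out["insured_name"].str.strip()
    # Coerce monetary strings like "1,000,000" to numeric; bad values become NaN.
    out["limit_usd"] = pd.to_numeric(
        out["limit_usd"].str.replace(",", "", regex=False), errors="coerce"
    )
    # Parse dates; unparseable values become NaT rather than raising.
    out["inception_date"] = pd.to_datetime(out["inception_date"], errors="coerce")
    # Validate completeness: drop rows missing required fields.
    return out.dropna(subset=["insured_name", "limit_usd"])

cleaned = clean_portfolio(raw)
```

In a real pipeline, functions like this would live in the shared internal utilities mentioned above, with schema definitions and validation rules maintained centrally.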
Requirements & Skills:
· 1–2 years of experience in a technical role involving data handling or analysis.
· Proficient in Python, with strong skills in Pandas for data manipulation and transformation.
· Familiarity with data wrangling techniques and best practices.
· Strong attention to detail and a keen eye for data quality and consistency.
· Excellent problem-solving skills and analytical thinking.
· Adaptable and eager to learn new tools, technologies, and domain knowledge.
Nice to have:
· Some reinsurance knowledge.
· Experience with Databricks.
· Familiarity with Git.
· Experience with PySpark or SQL.