We have just started working with a pretty interesting InsurTech company based out of Berlin. Insurance companies need a facelift, and I don't mean getting a mobile app and saying "hey, we're digital" – a real facelift where everything is personalised to each and every customer. That's exactly what this company is doing.
Everything is transparent, paperless and seamless. Good crowd too – I had a good chat with their CTO (nice guy), and I'm friends with one of the engineers there. It's supposed to be a great place to work.
You’ll be involved in:
- Data pipeline management – integrate third-party systems and APIs (inbound and outbound) into their internal systems and data lake using Airflow and Python scripts
- Data lake governance – own data sourcing, validation, processing & ingestion into data lake (S3)
- Define and implement data reconciliation solutions between data sources and data warehouse
- Collaborate with BI Analysts on data warehouse management
- Build reports based on S3 and Redshift data
- Build data quality monitoring and alerting for incoming integrations to ensure validity within their system
- Refactor an existing project into Python (validation and processing of incoming data with alerting, Lambda functions to integrate with outside APIs)
What you need to have (in an ideal world, but don't worry if you don't have it all):
- Experience in working with data pipelines and data management systems
- Experience building and maintaining backend Python services
- Experience building and deploying Dockerised applications
- Experience with AWS services (CLI, S3, Redshift, Lambda)
They are looking to get this role filled ASAP.
Apply on the link or hit me up on email@example.com for a chat.