Posted: 3 days ago
Work from Office
Full Time
Summary

Guidewire is searching for a unique individual who is ambitious, curious, and hungry for a rare chance to transform a 500-year-old industry from the inside out. Through our data listening capabilities, we collect more data (and more important data) than any other company in our market. We seek ways to make sense of it, showcase it, and transform it into insight that feeds billions of decision points every year across pricing, portfolio management, underwriting, claims management, and risk transfer. At Guidewire, a combination of good working conditions, an excellent market opportunity, a rational and meritocratic company culture, quality software products, and a long history of careful hiring has allowed us to create an enviable work environment.

Guidewire Analytics helps insurers and other financial institutions model new and evolving risks such as cyber. By combining internet-scale data listening, adaptive machine learning, and insurance risk modeling, Guidewire Analytics insights help P&C customers face new risks, take advantage of new opportunities, and develop new products.

Job Description

Responsibilities:

Development:
- Develop robust, scalable, and efficient data pipelines.
- Manage platform solutions to support data engineering needs and ensure seamless integration and performance.
- Write clean, efficient, and maintainable code.

Data Management and Optimization:
- Ensure data quality, integrity, and security across all data pipelines.
- Optimize data processing workflows for performance and cost-efficiency.
- Develop and maintain comprehensive documentation for data pipelines and related processes.

Innovation and Continuous Improvement:
- Stay current with emerging technologies and industry trends in big data and cloud computing.
- Propose and implement innovative solutions to improve data processing and analytics capabilities.
- Continuously evaluate and improve existing data infrastructure and processes.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in software engineering with a focus on data engineering and building data platforms.
- Strong programming experience in Python or Java.
- Experience with big data technologies such as Apache Spark, Amazon EMR, Apache Iceberg, Amazon Redshift, or similar.
- Experience with RDBMS (Postgres, MySQL, etc.) and NoSQL (MongoDB, DynamoDB, etc.) databases.
- Experience with AWS cloud services (e.g., Lambda, S3, Athena, Glue) or comparable cloud technologies.
- Experience with CI/CD.
- Experience working with event-driven and serverless architectures.
- Experience with platform solutions and containerization technologies (e.g., Docker, Kubernetes).
- Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
- Strong communication skills, both written and verbal.