Jobs
Interviews

6 Astronomer Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

Join our dynamic team as a software developer, where you will have the opportunity to solve complex problems and contribute to innovative projects. Enhance your skills in Python, PySpark, and cloud architecture while working in an inclusive and respectful team environment. As a Lead Software Engineer - Python / Spark Big Data at JPMorgan Chase within the Capital Reporting product, you will execute software solutions and design, develop, and troubleshoot technical issues. This role offers significant growth potential, a chance to work with cutting-edge technologies, and an opportunity to contribute to software engineering communities of practice and events that explore new and emerging technologies. We value diversity, equity, inclusion, and respect in our team culture. You will proactively identify hidden problems and patterns in data and use these insights to promote improvements to coding hygiene and system architecture.

Job Responsibilities:
- Participate in all aspects of the software development process, including requirements, design, coding, unit testing, quality assurance, and deployment.
- Use the right mix of open-source technologies and home-grown distributed computing frameworks to build software that meets the requirements.
- Contribute to the team's drive for continual improvement of the development process and innovative solutions to meet business needs.
- Ensure adherence to architecture standards, application robustness, and security policies.
- Gather, analyze, and draw conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development.
- Use agile software development methodologies such as Scrum for quick turnaround time.
- Manage a team of software engineers and build a high-performing team.
- Add to the team culture of diversity, equity, inclusion, and respect.
Required Qualifications, Capabilities, and Skills:
- Formal training or certification in software engineering concepts and 5 years of applied experience.
- Hands-on development experience in Python or PySpark and familiarity with cloud or microservices architecture concepts.
- Demonstrated knowledge of software applications and technical processes within a cloud or microservices architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Ability to identify opportunities for improvement within existing applications to increase stability and simplify the platform.
- Ability to work with a team of engineers and developers to ensure the Capital Risk platform is standardized, optimized, available, reliable, consistent, accessible, and secure to support business and technology needs.
- Provide operational excellence through root cause analysis and continuous improvement.
- Stay current on emerging technologies and new techniques to refine and improve overall delivery.
- Interact with partners across feature teams to collaborate on reusable services that meet solution requirements.

Preferred Qualifications, Capabilities, and Skills:
- Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka).
- Experience with Big Data solutions (e.g., Databricks) or relational databases.
- Experience in the financial services industry is nice to have.

Locations: Mumbai, Maharashtra, India

Posted 1 week ago

Apply

3.0 - 8.0 years

1 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Immediate joiners only (0-15 days notice considered). 3+ years of experience mandatory. Work mode: Hybrid. Work locations: Hyderabad, Bengaluru, Chennai, Pune. Mandatory skills: Azure, ADF, Spark, Astronomer.

Data Engineering topics:
- Kafka-based ingestion
- API-based ingestion
- Orchestration tools: Astronomer, Apache Airflow, Dagster, etc.
- Familiarity with Apache Iceberg, Delta, and Hudi table designs: when, why, and how to use them
- Spark architecture, optimization techniques, performance issues and mitigation techniques

Data Quality topics (data engineering without quality provides no value):
- Great Expectations (https://docs.greatexpectations.io/docs/core/introduction/try_gx/)
- Pydeequ (https://pydeequ.readthedocs.io/en/latest/index.html)
- Databricks DLT expectations (Spark-based)
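The data-quality tools named in this posting (Great Expectations, Pydeequ, DLT expectations) all follow the same pattern: declare expectations about a dataset, validate, and collect violations. A minimal stdlib-only sketch of that pattern, with hypothetical column names and rules not taken from the posting:

```python
# Sketch of the "expectations" pattern behind Great Expectations, Pydeequ,
# and DLT expectations: declare checks, run them, collect failures.
# Column names and thresholds below are hypothetical examples.

def expect_not_null(rows, column):
    """Fail if any row has a missing value in `column`."""
    bad = [r for r in rows if r.get(column) is None]
    return (len(bad) == 0, f"{column} not null: {len(bad)} violations")

def expect_between(rows, column, lo, hi):
    """Fail if any value in `column` falls outside [lo, hi]."""
    bad = [r for r in rows if not (lo <= r[column] <= hi)]
    return (len(bad) == 0, f"{column} in [{lo}, {hi}]: {len(bad)} violations")

def validate(rows, checks):
    """Run every check; return overall pass/fail plus per-check messages."""
    results = [check(rows) for check in checks]
    return all(ok for ok, _ in results), [msg for _, msg in results]

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},  # violates the range check
]
ok, messages = validate(rows, [
    lambda r: expect_not_null(r, "order_id"),
    lambda r: expect_between(r, "amount", 0, 10_000),
])
```

The real libraries add profiling, data docs, and Spark-scale execution, but the declare-then-validate shape is the same.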

Posted 1 month ago

Apply

3.0 - 5.0 years

10 - 13 Lacs

Chennai

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
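The SQL features this posting asks about (CTEs, window functions, aggregate functions) can be demonstrated in one query, runnable here against an in-memory SQLite database; the table and data are made up for illustration:

```python
# One query combining the SQL features named in the posting:
# a CTE with an aggregate, plus a window function over the CTE.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100), ("north", 300), ("south", 200)],
)

query = """
WITH regional AS (                                -- CTE with an aggregate
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rnk   -- window function
FROM regional
ORDER BY rnk
"""
rows = conn.execute(query).fetchall()
# rows: [("north", 400, 1), ("south", 200, 2)]
```

Note that window functions require SQLite 3.25+, which ships with all currently supported Python versions.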

Posted 3 months ago

Apply

7.0 - 9.0 years

13 - 17 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Utilize Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-Stack Development (7+ years): Strong expertise in full-stack Java development, with experience building and maintaining complex web applications.
- Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow, and with GCP architecture.
- Python: Proficiency in Python for automation and data engineering tasks.
- Cloud Architecture: Solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Posted 3 months ago

Apply

7.0 - 12.0 years

19 - 22 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Utilize Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-Stack Development (7+ years): Strong expertise in full-stack Java development, with experience building and maintaining complex web applications.
- Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow, and with GCP architecture.
- Python: Proficiency in Python for automation and data engineering tasks.
- Cloud Architecture: Solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Mandatory Key Skills: BigQuery, Astronomer, Terraform, Airflow, Cloud Architecture, Java, Google Cloud Platform*, Python*

Posted 3 months ago

Apply

5.0 - 7.0 years

30 - 40 Lacs

Bengaluru

Hybrid

Senior Software Developer (Python)

Experience: 5-7 years
Salary: Up to USD 40,000/year
Preferred Notice Period: Within 60 days
Shift: 11:00 AM to 8:00 PM IST
Opportunity Type: Hybrid (Bengaluru)
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: Apache Airflow, Astronomer, Pandas/PySpark/Dask, RESTful API, Snowflake, Docker, Python, SQL
Good-to-have skills: CI/CD, Data Visualization, Matplotlib, Prometheus, AWS, Kubernetes

A Single Platform for Loans/Securities & Finance (one of Uplers' clients) is looking for a Senior Software Developer (Python) who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Job Summary: We are seeking a highly skilled Senior Python Developer with expertise in large-scale data processing and Apache Airflow. The ideal candidate will be responsible for designing, developing, and maintaining scalable data applications and optimizing data pipelines. You will be an integral part of our R&D and Technical Operations team, focusing on data engineering, workflow automation, and advanced analytics.

Key Responsibilities:
- Design and develop sophisticated Python applications for processing and analyzing large datasets.
- Implement efficient and scalable data pipelines using Apache Airflow and Astronomer.
- Create, optimize, and maintain Airflow DAGs for complex workflow orchestration.
- Work with data scientists to implement and scale machine learning models.
- Develop robust APIs and integrate various data sources and systems.
- Optimize application performance for handling petabyte-scale data operations.
- Debug, troubleshoot, and enhance existing Python applications.
- Write clean, maintainable, and well-tested code following best practices.
- Participate in code reviews and mentor junior developers.
- Collaborate with cross-functional teams to translate business requirements into technical solutions.

Required Skills & Qualifications:
- Strong programming skills in Python with 5+ years of hands-on experience.
- Proven experience with large-scale data processing frameworks (e.g., Pandas, PySpark, Dask).
- Extensive hands-on experience with Apache Airflow for workflow orchestration.
- Experience with the Astronomer platform for Airflow deployment and management.
- Proficiency in SQL and experience with the Snowflake database.
- Expertise in designing and implementing RESTful APIs.
- Basic knowledge of Java programming.
- Experience with containerization technologies (Docker).
- Strong problem-solving skills and the ability to work independently.

Preferred Skills:
- Experience with cloud platforms (AWS).
- Knowledge of CI/CD pipelines and DevOps practices.
- Familiarity with Kubernetes for container orchestration.
- Experience with data visualization libraries (Matplotlib, Seaborn, Plotly).
- Background in financial services or experience with financial data.
- Proficiency in monitoring tools like Prometheus, Grafana, and the ELK stack.

Engagement Type: Full-time direct hire on RiskSpan payroll
Job Type: Permanent
Location: Hybrid (Bengaluru)
Working time: 11:00 AM to 8:00 PM
Interview Process: 3-4 rounds

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload an updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: RiskSpan uncovers insights and mitigates risk for mortgage loans and structured products. The Edge Platform provides data and predictive models to run forecasts under a range of scenarios and analyze Agency and non-Agency MBS, loans, and MSRs. Leverage our cutting-edge cloud, machine learning, and AI capabilities to scale faster, optimize model builds, and manage information more efficiently.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
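The Airflow DAG work this role centers on boils down to declaring tasks with dependencies and running them in dependency order. A stdlib-only sketch of that core idea using Python's `graphlib` (task names are hypothetical; a real pipeline would use Airflow's own DAG and operator API):

```python
# Core idea behind Airflow-style DAG orchestration: tasks with
# dependencies are scheduled in topological order.
# Task names are made up for illustration.
from graphlib import TopologicalSorter

# Mapping: task -> set of tasks it depends on (which must run first).
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load", "validate"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

Airflow layers scheduling, retries, and parallel execution of independent tasks on top, but a valid topological ordering is the invariant every run must respect.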

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies