3.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Qualification
- OLAP, data engineering, data warehousing, ETL
- Hadoop ecosystem, or AWS, Azure or GCP clusters and processing
- Experience working on Hive, Spark SQL, Redshift or Snowflake (see the sketch after this posting)
- Experience writing and troubleshooting SQL or MDX queries
- Experience working on Linux
- Experience with Microsoft Analysis Services (SSAS) or other OLAP tools
- Tableau, MicroStrategy or any BI tool
- Programming expertise in Python, Java or shell script would be a plus

Role
- Be the front-end person of the world's most scalable OLAP product company - Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for prospects on technical issues during the POV stage.
- Be instrumental in reading the pulse of the big data market and defining the product roadmap.
- Lead a few small but highly efficient teams of big data engineers.
- Report task status efficiently to stakeholders and customers.
- Good verbal and written communication skills.
- Be willing to work off hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience: 3 to 6 years
Job Reference Number: 10350
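The Hive/Spark SQL experience asked for above can be pictured with a minimal PySpark sketch. This is an illustration only, not part of the posting; the table and column names (sales, region, revenue, order_date) are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

# Minimal sketch: querying a Hive-style table with Spark SQL and producing
# an OLAP-style rollup. Table and column names are hypothetical.
spark = (SparkSession.builder
         .appName("olap-rollup-sketch")
         .enableHiveSupport()
         .getOrCreate())

rollup = spark.sql("""
    SELECT region,
           year(order_date) AS order_year,
           SUM(revenue)     AS total_revenue
    FROM sales
    GROUP BY region, year(order_date)
    ORDER BY region, order_year
""")
rollup.show()
```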
Posted 5 days ago
5.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Qualification

Required
- Proven hands-on experience designing, developing and supporting database projects for analysis in a demanding environment.
- Proficiency in database design techniques, both relational and dimensional.
- Experience with, and a strong understanding of, the business analysis techniques used.
- High proficiency in SQL or MDX queries.
- Ability to manage multiple maintenance, enhancement and project-related tasks.
- Ability to work independently on multiple assignments and to work collaboratively within a team.
- Strong communication skills with both internal team members and external business stakeholders.

Added Advantage
- Hadoop ecosystem, or AWS, Azure or GCP clusters and processing.
- Experience working on Hive, Spark SQL, Redshift or Snowflake.
- Experience working on Linux systems.
- Experience with Tableau, MicroStrategy, Power BI or any BI tool.
- Programming expertise in Python, Java or shell script would be a plus.

Roles & Responsibilities
- Be the front-end person of the world's most scalable OLAP product company - Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for customers on technical issues during the project.
- Be instrumental in reading the pulse of the big data market and defining the product roadmap.
- Lead a few small but highly efficient teams of big data engineers.
- Report task status efficiently to stakeholders and customers.
- Good verbal and written communication skills.
- Be willing to work off hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience: 5 to 10 years
Job Reference Number: 11078
Posted 5 days ago
3.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Qualification - Pre-Sales Solution Engineer, India

Experience Areas or Skills
- Pre-sales experience with software or analytics products
- Excellent verbal and written communication skills
- OLAP tools or Microsoft Analysis Services (SSAS)
- Data engineering, data warehousing or ETL
- Hadoop ecosystem, or AWS, Azure or GCP clusters and processing
- Tableau, MicroStrategy or any BI tool
- HiveQL, Spark SQL, PL/SQL or T-SQL
- Writing and troubleshooting SQL or MDX queries
- Working on Linux
- Programming in Python, Java or JavaScript would be a plus
- Filling in RFPs or questionnaires from customers
- NDAs, success criteria, project closure and other documentation
- Be willing to travel or relocate as per requirements

Role
- Act as the main point of contact for customer contacts involved in the evaluation process
- Give product demonstrations to qualified leads
- Give product demonstrations in support of marketing activity such as events or webinars
- Own RFPs, NDAs, PoC success criteria documents, PoC closure and other documents
- Secure alignment on process and documents with the customer/prospect
- Own the technical-win phases of all active opportunities
- Understand the customer domain and database schema
- Provide OLAP and reporting solutions
- Work closely with customers to understand and resolve environment, OLAP cube or reporting-related issues
- Coordinate with the solutioning team on execution of the PoC as per the success plan
- Create enhancement requests or identify requests for new features on behalf of customers or hot prospects

Experience: 3 to 6 years
Job Reference Number: 10771
Posted 5 days ago
3.0 years
15 - 20 Lacs
Madurai, Tamil Nadu
On-site
Dear Candidate,

Greetings of the day!

I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, with the primary objective of delivering strategic solutions toward the goals of its business partners. We are a leading full-scale software and mobile app development company, driven by the mantra "Clients' Vision is our Mission", and we hold ourselves to that statement. Our aim is to be the technologically advanced and most loved organization, providing high-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Techmango: https://www.techmango.net/

Job Title: GCP Data Engineer
Location: Madurai
Experience: 5+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Engineer, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (see the sketch after this posting)
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, and DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing of the data ecosystem

Required Skills & Qualifications
- 5+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3 years of hands-on experience with GCP data services
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership

Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,000,000.00 per year

Application Question(s):
- Current CTC?
- Expected CTC?
- Notice period? (If you are serving your notice period, please mention your last working day.)

Experience:
- GCP Data Architecture: 3 years (Required)
- BigQuery: 3 years (Required)
- Cloud Composer (Airflow): 3 years (Required)

Location: Madurai, Tamil Nadu (Required)
Work Location: In person
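As a rough illustration of the Pub/Sub-to-BigQuery pipeline work this role describes, here is a minimal Apache Beam sketch in Python. It is not part of the posting; the project, topic, table, and schema names are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Minimal sketch of a streaming ingestion pipeline: Pub/Sub -> parse -> BigQuery.
# Project, topic, and table identifiers are hypothetical placeholders.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="my-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

In a real deployment this would run on Dataflow (by passing the DataflowRunner option) and be scheduled or monitored via Cloud Composer, per the responsibilities above.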
Posted 5 days ago
10.0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
Key Responsibilities
- Familiarity with modern storage formats like Parquet and ORC.
- Design and develop conceptual, logical, and physical data models to support enterprise data initiatives.
- Build, maintain, and optimize data models within Databricks Unity Catalog (see the sketch after this posting).
- Develop efficient data structures using Delta Lake, optimizing for performance, scalability, and reusability.
- Collaborate with data engineers, architects, analysts, and stakeholders to ensure data models align with ingestion pipelines and business goals.
- Translate business and reporting requirements into robust data architecture using best practices in data warehousing and Lakehouse design.
- Maintain comprehensive metadata artifacts, including data dictionaries, data lineage, and modeling documentation.
- Enforce and support data governance, data quality, and security protocols across data ecosystems.
- Continuously evaluate and improve modeling processes.

Skills and Experience
- 10+ years of hands-on experience in data modeling in Big Data environments.
- Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices.
- Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault.
- Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart.
- Proven experience in Databricks with Unity Catalog and Delta Lake.
- Strong command of SQL and Apache Spark for querying and transformation.
- Hands-on experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
- Exposure to Azure Purview or similar data cataloging tools.
- Strong communication and documentation skills, with the ability to work in cross-functional agile teams.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure.
- Experience working in agile/scrum environments.
- Exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) is a plus.

(ref:hirist.tech)
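A minimal PySpark sketch of the Unity Catalog / Delta Lake modeling work described above, added for illustration only; the source path and the three-level table name (main.sales.dim_customer) are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch: building a simple customer dimension from raw data and
# registering it as a Delta table under Unity Catalog's catalog.schema.table
# namespace. All names and paths are hypothetical.
spark = SparkSession.builder.getOrCreate()

dim_customer = (
    spark.read.format("parquet").load("/mnt/raw/customers")
    .select("customer_id", "name", "country")
    .withColumn("load_ts", F.current_timestamp())  # audit column for lineage
)

(dim_customer.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("main.sales.dim_customer"))
```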
Posted 6 days ago
7.0 years
6 - 10 Lacs
Bengaluru
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - Strategy and Transactions - SaT - DnA Associate Manager

EY's Data n' Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We're looking for an Associate Manager - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role works closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also on-shore facing. The role will be instrumental in designing, developing and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. You will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse setup/data mart creation, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve capabilities (OLAP) in order to create impactful decision analytics reporting (a brief sketch of such a scrub step follows this posting).

Your key responsibilities
- Evaluate and select data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements
- Maintain strong working knowledge across the technology stack, including ETL, ELT, data analysis, metadata, data quality, audit and design
- Design, develop and test in an ETL tool environment (GUI/canvas-driven tools to create workflows)
- Produce design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.)
- Provide technical leadership to a team of data warehouse and business intelligence developers
- Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy and security
- Adhere to ETL/data warehouse development best practices
- Own data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP)
- Assist the team with performance tuning for ETL and database processes

Skills and attributes for success
- Minimum of 7 years of total experience, with 3+ years in the data warehousing/business intelligence field
- Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies, using industry-standard tools and technologies
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP)
- Minimum 3+ years of experience with Azure database offerings (relational, NoSQL, data warehouse)
- 2+ years of hands-on experience with various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks
- Minimum of 3 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server, Oracle/MySQL, Azure SQL and Azure Synapse
- Strong in PySpark and SparkSQL
- Knowledge of, and direct experience using, business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS cubes, etc.)
- Strong creative instincts related to data analysis and visualization; aggressive curiosity to learn the business methodology, data model and user personas
- Strong understanding of BI and DWH best practices, analysis, visualization and latest trends
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management
- Willingness to mentor team members
- Solid analytical, technical and problem-solving skills
- Excellent written and verbal communication skills

To qualify for the role, you must have
- A bachelor's or equivalent degree in computer science or a related field (required); an advanced degree or equivalent business experience is preferred
- A fact-driven, analytical mindset with excellent attention to detail
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analyzing large volumes of data
- Relevant work experience of a minimum of 6 to 8 years in a Big 4 or technology/consulting setup

Ideally, you'll also have
- The ability to think strategically/end-to-end with a result-oriented mindset
- The ability to build rapport within the firm and win the trust of clients
- Willingness to travel extensively and to work on client sites/practice office locations
- Experience in Snowflake

What we look for
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY SaT practices globally, with leading businesses across a range of industries

What working at EY offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
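The role summary above calls for ETL processing that collects and scrubs structured data before warehouse loading. A minimal PySpark sketch of such a scrub step, added for illustration only; the file paths and column names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of an ETL scrub step: read raw structured data, clean it,
# and stage it for a warehouse load. Paths and column names are hypothetical.
spark = SparkSession.builder.appName("etl-scrub-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/staging/raw/transactions.csv")

clean = (
    raw
    .dropDuplicates(["transaction_id"])            # remove duplicate loads
    .filter(F.col("amount").isNotNull())           # drop incomplete rows
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
)

# Stage as Parquet for downstream dimensional/OLAP modeling.
clean.write.mode("overwrite").parquet("/staging/clean/transactions")
```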
Posted 6 days ago
6.0 years
6 - 10 Lacs
Bengaluru
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - Strategy and Transactions - SaT - DnA Senior Analyst

EY's Data n' Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We're looking for a Senior Analyst - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role works closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also on-shore facing. The role will be instrumental in designing, developing and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. You will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse setup/data mart creation, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve capabilities (OLAP) in order to create impactful decision analytics reporting.

Your key responsibilities
- Evaluate and select data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements
- Maintain strong working knowledge across the technology stack, including ETL, ELT, data analysis, metadata, data quality, audit and design
- Design, develop and test in an ETL tool environment (GUI/canvas-driven tools to create workflows)
- Produce design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.)
- Provide technical leadership to a team of data warehouse and business intelligence developers
- Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy and security
- Adhere to ETL/data warehouse development best practices
- Own data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP)
- Assist the team with performance tuning for ETL and database processes

Skills and attributes for success
- Minimum of 6 years of total experience, with 3+ years in the data warehousing/business intelligence field
- Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies, using industry-standard tools and technologies
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP)
- Minimum 3+ years of experience with Azure database offerings (relational, NoSQL, data warehouse)
- 2+ years of hands-on experience with various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks
- Minimum of 3 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server, Oracle/MySQL, Azure SQL and Azure Synapse
- Strong in PySpark and SparkSQL
- Knowledge of, and direct experience using, business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS cubes, etc.)
- Strong creative instincts related to data analysis and visualization; aggressive curiosity to learn the business methodology, data model and user personas
- Strong understanding of BI and DWH best practices, analysis, visualization and latest trends
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management
- Willingness to mentor team members
- Solid analytical, technical and problem-solving skills
- Excellent written and verbal communication skills

To qualify for the role, you must have
- A bachelor's or equivalent degree in computer science or a related field (required); an advanced degree or equivalent business experience is preferred
- A fact-driven, analytical mindset with excellent attention to detail
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analyzing large volumes of data
- Relevant work experience of a minimum of 6 to 8 years in a Big 4 or technology/consulting setup

Ideally, you'll also have
- The ability to think strategically/end-to-end with a result-oriented mindset
- The ability to build rapport within the firm and win the trust of clients
- Willingness to travel extensively and to work on client sites/practice office locations
- Experience in Snowflake

What we look for
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY SaT practices globally, with leading businesses across a range of industries

What working at EY offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Location: Open

Position: Data Engineer (GCP) - Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field, with 5 to 10 years of technology experience

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform: basic cloud computing concepts; BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
- Python, the Google Cloud Python SDK, SQL (see the sketch after this posting)
- Experience working with any NoSQL, columnar or MPP database
- Experience working with any ETL tool (Informatica, DataStage, Talend, Pentaho, etc.)
- Strong knowledge of database concepts: data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, MPP architecture

Other Desired Skills:
- Excellent communication and coordination skills
- Problem understanding, articulation and solutioning
- Quick learner, adaptable to new technologies
- Ability to research and solve technical issues

Responsibilities:
- Developing data pipelines (batch/streaming)
- Developing complex data transformations
- ETL orchestration
- Data migration
- Developing and maintaining data warehouses/data lakes

Good to Have:
- Experience working with Apache Spark/Kafka
- Machine learning concepts
- Google Cloud Professional Data Engineer certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
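A minimal sketch of the Google Cloud Python SDK skills listed above, using the BigQuery client library. Illustration only; the project, dataset, and query are hypothetical placeholders.

```python
from google.cloud import bigquery

# Minimal sketch: run an analytical query with the BigQuery client library
# and iterate over the results. The project and table are hypothetical.
client = bigquery.Client(project="my-project")

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row["customer_id"], row["total_spend"])
```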
Posted 6 days ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Description:
Let's do this. Let's change the world. We are looking for a highly motivated, expert Principal Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks, with domain expertise in the R&D domain. The ideal candidate will be responsible for designing, developing and optimizing data pipelines, data integration frameworks and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling and governance frameworks to support self-service analytics, AI-driven insights and enterprise-wide data management.

Roles & Responsibilities:
- Architect and maintain robust, scalable data pipelines using Databricks, Spark, and Delta Lake, enabling efficient batch and real-time processing.
- Lead efforts to evaluate, adopt, and integrate emerging technologies and tools that enhance productivity, scalability, and data delivery capabilities.
- Drive performance optimization efforts, including Spark tuning, resource utilization, job scheduling, and query improvements (see the tuning sketch after this posting).
- Identify and implement innovative solutions that streamline data ingestion, transformation, lineage tracking, and platform observability.
- Build frameworks for metadata-driven data engineering, enabling reusability and consistency across pipelines.
- Foster a culture of technical excellence, experimentation, and continuous improvement within the data engineering team.
- Collaborate with platform, architecture, analytics, and governance teams to align platform enhancements with enterprise data strategy.
- Define and uphold SLOs, monitoring standards, and data quality KPIs for production pipelines and infrastructure.
- Partner with cross-functional teams to translate business needs into scalable, governed data products.
- Mentor engineers across the team, promoting knowledge sharing and adoption of modern engineering patterns and tools.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications
- 12 to 17 years of experience in Computer Science, IT or a related field.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
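A minimal PySpark sketch of the kind of Spark query tuning the responsibilities mention, showing a broadcast join and shuffle-partition control. Illustration only; the table names and partition count are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

# Minimal sketch of two common Spark optimizations: broadcasting a small
# dimension table to avoid a shuffle join, and sizing shuffle partitions
# to the cluster rather than relying on the default.
spark = (SparkSession.builder
         .appName("spark-tuning-sketch")
         .config("spark.sql.shuffle.partitions", "200")  # hypothetical value
         .getOrCreate())

facts = spark.read.table("events")       # large fact table (hypothetical)
dims = spark.read.table("event_types")   # small lookup table (hypothetical)

# broadcast() ships the small table to every executor, turning an
# expensive shuffle join into a map-side join.
joined = facts.join(broadcast(dims), on="event_type_id")

joined.write.mode("overwrite").parquet("/curated/events_enriched")
```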
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role

Role Description:
Let's do this. Let's change the world. We are looking for a highly motivated, expert Principal Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing and optimizing data pipelines, data integration frameworks and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling and governance frameworks to support self-service analytics, AI-driven insights and enterprise-wide data management.

Roles & Responsibilities:
- Architect and maintain robust, scalable data pipelines using Databricks, Spark, and Delta Lake, enabling efficient batch and real-time processing.
- Lead efforts to evaluate, adopt, and integrate emerging technologies and tools that enhance productivity, scalability, and data delivery capabilities.
- Drive performance optimization efforts, including Spark tuning, resource utilization, job scheduling, and query improvements.
- Identify and implement innovative solutions that streamline data ingestion, transformation, lineage tracking, and platform observability.
- Build frameworks for metadata-driven data engineering, enabling reusability and consistency across pipelines.
- Foster a culture of technical excellence, experimentation, and continuous improvement within the data engineering team.
- Collaborate with platform, architecture, analytics, and governance teams to align platform enhancements with enterprise data strategy.
- Define and uphold SLOs, monitoring standards, and data quality KPIs for production pipelines and infrastructure.
- Partner with cross-functional teams to translate business needs into scalable, governed data products.
- Mentor engineers across the team, promoting knowledge sharing and adoption of modern engineering patterns and tools.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications
- 12 to 17 years of experience in Computer Science, IT or a related field.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Principal Data Engineer

What You Will Do
Let's do this. Let's change the world.

Role Description:
We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possess strong rapid-prototyping skills and the ability to quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop, and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
- Architect and manage cloud-based data solutions, using AWS or other preferred platforms.
- Lead and motivate a strong data engineering team to deliver exceptional results.
- Identify, analyze, and resolve complex data-related challenges.
- Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 12 to 17 years of experience in Computer Science, IT or a related field.
- Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficient in Python, PySpark, and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop strong data engineering teams.
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow (see the sketch after this posting).
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP or Azure cloud services.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
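The Apache Airflow experience listed under Preferred Qualifications can be pictured with a minimal DAG sketch. Illustration only; the DAG id, schedule, and task logic are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Minimal sketch of an Airflow DAG orchestrating a daily extract -> load flow.
# All names and the task bodies are hypothetical placeholders.

def extract(**context):
    print("pulling source data for", context["ds"])

def load(**context):
    print("loading curated data for", context["ds"])

with DAG(
    dag_id="daily_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # load runs only after extract succeeds
```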
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Profile
LSEG is embarking on a Finance Transformation programme, delivering our Finance Vision and redefining the way we work to bring value and deliver sustainable growth for the business. The programme will drive efficiencies and maximise benefits for LSEG by transforming how we manage financial crime through our internal engineering capabilities.

We have an exciting opportunity for a Data Engineer to join our dynamic team within the London Stock Exchange Group, working as part of the Financial Crime Engineering team and supporting LSEG by designing and developing data warehousing solutions and integrating our core systems. The role sits within the Corporate Engineering Delivery function, which provides technology services to Corporate Functions teams for LSEG, and we are looking for someone with demonstrated ability for our Financial Crime Engineering team.

This is a hands-on development position for a candidate with a demonstrable record of data analysis and database development; excellent development and problem-solving skills; the creativity and self-motivation to deliver critically important projects against timelines and competing priorities with execution excellence (preferably in financial services); experience supporting enterprise applications across various business units; and good exposure to various database engineering platforms.

As a Data Engineer, you will be responsible for:
- Working with a team of developers to deliver the data-led product with the desired functionality and business outcomes.
- Managing the backlog, and designing, developing, delivering and supporting data warehouse changes in an Agile methodology.
- Ensuring data structures and database designs align with application requirements, organization standards, business goals, and scalability needs.
- Developing ETL pipelines and processes for OLAP systems and data warehouses.
- Managing build and release activities for data solutions.
- Assisting with architecture artefacts to support product changes.
- Maintaining and optimizing data models through continuous improvements and enhancements.
- Working with product owners, architects, business analysts, the scrum master and other team members to deliver change on a timely basis.

Knowledge/Skills
- Hands-on experience writing advanced SQL queries.
- Strong analytical problem-solving skills.
- Hands-on development experience, with solid skills in designing, developing and deploying complex applications using OLTP- and OLAP-based databases.
- Hands-on expertise in physical data modeling and DB design, with solid performance-tuning skills.
- Hands-on experience troubleshooting and resolving database performance issues.
- Hands-on experience building systems using modern, scalable, resilient, cloud-native architectures; experience on AWS is desirable.
- Hands-on experience with multiple databases, including Snowflake, and implementing complex stored procedures (see the sketch after this posting).
- Good knowledge of data modeling concepts like dimensional modeling, and DWH concepts like change data capture (CDC).
- Hands-on experience with Matillion, Boomi or other ETL tools.
- Scripting knowledge (e.g. Python, Spark) is desirable.
- Ability to provide production support for data warehouse issues, such as data load problems and transformation/translation problems.
- Experience in offshore/onsite engagements, collaborating across time zones.
- A great teammate, eager to learn new technologies and/or functional areas.

Experience
- At least 6 years of experience with various databases and ETL tools like Informatica and Matillion.
- End-to-end ETL experience covering documentation, development, testing and deployment to production.
- Bachelor's degree in computer science or a related field.
- Strong relational database background and SQL skills.
- Proficiency in automation and continuous delivery methods.
- Proficiency in all aspects of the software development life cycle in an Agile environment.
- Experience leading and managing a development team, preferably at a financial technology or banking MNC.

LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions.

Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity.

LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.

Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it's used for, and how it's obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
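A minimal sketch of the Snowflake work noted in the skills list, using the snowflake-connector-python library. Illustration only; the account, credentials, and object names are hypothetical placeholders.

```python
import snowflake.connector

# Minimal sketch: connect to Snowflake and run an analytical query.
# Account, credentials, and object names are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",          # in practice, inject via a secrets manager
    warehouse="ANALYTICS_WH",
    database="DWH",
    schema="CORE",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT region, SUM(amount) FROM fact_payments GROUP BY region"
    )
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```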
Posted 1 week ago
12.0 - 17.0 years
7 - 8 Lacs
Hyderābād
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Principal Data Engineer

What you will do
Let's do this. Let's change the world.

Role Description:
We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possess strong rapid-prototyping skills and the ability to quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop, and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
- Architect and manage cloud-based data solutions, using AWS or other preferred platforms.
- Lead and motivate a strong data engineering team to deliver exceptional results.
- Identify, analyze, and resolve complex data-related challenges.
- Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 12 to 17 years of experience in Computer Science, IT or a related field.
- Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficient in Python, PySpark, and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop strong data engineering teams.
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP or Azure cloud services.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Andhra Pradesh
On-site
ABOUT EVERNORTH:
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview
Excited to grow your career? This position's primary responsibility will be to translate software requirements into functions using mainframe, ETL, and data engineering skills, with expertise in Databricks and database technologies. The position offers the opportunity to work on modernizing legacy systems, contribute to cloud infrastructure automation, and support production systems in a fast-paced, agile environment. You will work across multiple teams and technologies to ensure reliable, high-performance data solutions that align with business goals.

As a Mainframe & ETL Engineer, you will be responsible for the end-to-end development and support of data processing solutions using tools such as Talend, Ab Initio, AWS Glue, and PySpark, with significant work on Databricks and modern cloud data platforms. You will support infrastructure provisioning using Terraform, assist in modernizing legacy systems including mainframe migration, and contribute to performance tuning of complex SQL queries across multiple database platforms, including Teradata, Oracle, Postgres, and DB2. You will also be involved in CI/CD practices.

Responsibilities
- Support, maintain and participate in the development of software utilizing technologies such as COBOL, DB2, CICS and JCL.
- Support, maintain and participate in ETL development utilizing technologies such as Talend, Ab Initio, Python, and PySpark using Databricks.
- Work with Databricks to design and manage scalable data processing solutions.
- Implement and support data integration workflows across cloud (AWS) and on-premises environments (see the sketch after this posting).
- Support cloud infrastructure deployment and management using Terraform.
- Participate in the modernization of legacy systems, including mainframe migration.
- Perform complex SQL queries and performance tuning on large datasets.
- Contribute to CI/CD pipelines, version control, and infrastructure automation.
- Provide expertise, tools, and assistance to operations, development, and support teams for critical production issues and maintenance.
- Troubleshoot production issues, diagnose the problem, and implement a solution; act as the first line of defense in finding the root cause.
- Work cross-functionally with the support, development and business teams to efficiently address customer issues.
- Be an active member of a high-performance software development and support team in an agile environment.
- Engage in fostering and improving organizational culture.

Qualifications

Required Skills:
- Strong analytical and technical skills.
- Proficiency in Databricks, including notebook development, Delta Lake, and Spark-based processing.
- Experience with mainframe modernization or migrating legacy systems to modern data platforms.
- Strong programming skills, particularly in PySpark for data processing.
- Familiarity with data warehousing concepts and cloud-native architecture.
Solid understanding of Terraform for managing infrastructure as code on AWS. Familiarity with CI/CD practices and tools (e.g., Git, Jenkins). Strong SQL knowledge on OLAP DB platforms (Teradata, Snowflake) and OLTP DB platforms (Oracle, DB2, Postgres, SingleStore). Strong experience with Teradata SQL and Utilities Strong experience with Oracle, Postgres and DB2 SQL and Utilities Develop high quality database solutions Ability to do extensive analysis on complex SQL processes and design skills Ability to analyze existing SQL queries for performance improvements Experience in software development phases including design, configuration, testing, debugging, implementation, and support of large-scale, business centric and process-based applications Proven experience working with diverse teams of technical architects, business users and IT areas on all phases of the software development life cycle. Exceptional analytical and problem-solving skills Structured, methodical approach to systems development and troubleshooting Ability to ramp up fast on a system architecture Experience in designing and developing process-based solutions or BPM (business process management) Strong written and verbal communication skills with the ability to interact with all levels of the organization. Strong interpersonal/relationship management skills. Strong time and project management skills. Familiarity with agile methodology including SCRUM team leadership. Familiarity with modern delivery practices such as continuous integration, behavior/test driven development, and specification by example. Desire to work in application support space Passion for learning and desire to explore all areas of IT. Required Experience & Education: Minimum of 8-12 years of experience in application development role. Bachelor’s degree equivalent in Information Technology, Business Information Systems, Technology Management, or related field of study. Location & Hours of Work: Hyderabad and Hybrid (13:00 AM IST to 10:00 PM IST) Equal Opportunity Statement Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations. About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
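Purely as an illustration of the mainframe-to-Databricks work described above, here is a minimal PySpark sketch that parses a fixed-width extract (the typical shape of a COBOL file feed) and lands it in a Delta table; the path, column layout, and table name are hypothetical, not taken from the posting.

```python
# Minimal sketch, assuming a fixed-width mainframe extract; layout and paths
# below are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mainframe-extract-ingest").getOrCreate()

# Hypothetical copybook-derived layout: (column name, 1-based start, length)
LAYOUT = [("member_id", 1, 10), ("claim_date", 11, 8), ("amount", 19, 11)]

raw = spark.read.text("/mnt/landing/claims_extract.dat")  # one record per line

parsed = (
    raw.select(
        *[F.trim(F.substring("value", start, length)).alias(name)
          for name, start, length in LAYOUT]
    )
    .withColumn("claim_date", F.to_date("claim_date", "yyyyMMdd"))
    .withColumn("amount", F.col("amount").cast("decimal(9,2)"))
)

# Append into a Delta table; merge/overwrite strategies would depend on the feed.
parsed.write.format("delta").mode("append").saveAsTable("bronze.claims_extract")
```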
Posted 1 week ago
12.0 years
0 Lacs
Delhi Cantonment, Delhi, India
On-site
What Makes Us a Great Place To Work
We are proud to be consistently recognized as one of the world’s best places to work. We are currently the #1 ranked consulting firm on Glassdoor’s Best Places to Work list and have maintained a spot in the top four on Glassdoor’s list since its founding in 2009. Extraordinary teams are at the heart of our business strategy, but these don’t happen by chance. They require intentional focus on bringing together a broad set of backgrounds, cultures, experiences, perspectives, and skills in a supportive and inclusive work environment. We hire people with exceptional talent and create an environment in which every individual can thrive professionally and personally.
Who You’ll Work With
You’ll join our Engineering experts within the AI, Insights & Solutions team. This team is part of Bain’s digital capabilities practice, which includes experts in analytics, engineering, product management, and design. In this multidisciplinary environment, you'll leverage deep technical expertise with business acumen to help clients tackle their most transformative challenges. You’ll work on integrated teams alongside our general consultants and clients to develop data-driven strategies and innovative solutions. Together, we create human-centric solutions that harness the power of data and artificial intelligence to drive competitive advantage for our clients. Our collaborative and supportive work environment fosters creativity and continuous learning, enabling us to consistently deliver exceptional results. We are committed to building a diverse and inclusive team and encourage candidates of all backgrounds to apply. Bain offers comprehensive benefits and flexible policies that are designed to support you, so you can thrive personally and professionally.
What You’ll Do
As an Expert Senior Manager, Software Engineering (Tech Lead), you will lead the development and application of technical solutions to address complex problems in various industries. You will mentor and guide a diverse engineering team through the entire engineering life cycle. Your responsibilities will include designing, developing, optimizing, and deploying cutting-edge data engineering solutions and infrastructure at the production scale required by the world’s largest companies.
• Collaborate closely with and influence general consulting teams to identify software solutions to client business problems, and to appropriately scope, prioritize and execute those solutions
• Overall technical leader responsible for end-to-end technical solution delivery on client cases (from solution architecture to hands-on development work)
• Lead the entire software development life cycle, including architecture design, writing clean code, conducting code reviews, writing documentation, unit/integration tests, and identifying issues and resolutions
• Participate in expert client advisory activities that require deep expertise in software engineering with distributed systems and application architecture
• Collaborate on (or lead) the development of re-usable common frameworks, models and components that can be highly leveraged to address common software engineering problems across industries and business functions
• Work with the team and other senior leaders to create a great working environment that attracts other great engineers
• Coach engineering teams at our clients and partners to raise their capabilities and ensure that our work is successfully deployed to the highest standards
• Drive best demonstrated practices in software engineering, and share learnings with team members in AAG about theoretical and technical developments in software engineering
• Drive industry-leading innovations that translate into great impact for our clients in case work
• Act as PD Advisor as needed
• Lead recruiting and onboarding for other team members
Travel is required (30%)
Location: Singapore (Hybrid)
ABOUT YOU
Required
• Minimum 12 years of relevant professional hands-on experience in web development, programming languages, version control, software design patterns, infrastructure and deployment, and integration and unit testing implementation
• Minimum 3 years of experience managing software engineers
• Master’s degree in Computer Science, Engineering, or a related technical field
• Commercial acumen and understanding of business models
• Experience leading multiple projects independently and growing and developing more junior engineers
Technical Skills And Knowledge
• Track record of shipping production, enterprise-scale AI applications and data analytics products
• Expert knowledge (5+ years) of Python
• Deep experience with additional server-side frameworks and technologies such as FastAPI, Node.js, Flask
• Experience with Cloud platforms and services (AWS, Azure, GCP, etc.)
• Experience working in accordance with DevSecOps principles, and familiarity with industry deployment best practices using CI/CD tools, MLOps, LLMOps and infrastructure as code (Jenkins, Docker, Kubernetes, and Terraform)
• Strong computer science fundamentals in data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the implications of computer architecture on software performance
• Experience with data architecture, database schema design, database scalability and SQL
• Experience with client-side technologies such as React, Angular, Vue.js, HTML and CSS
• Hands-on experience in designing and optimizing OLTP systems for real-time processing, as well as building scalable OLAP data pipelines for batch and streaming analytics
• Good to have: experience developing AI-driven applications and solutions
• Understanding of data security and privacy regulations, key topics in cybersecurity, and authentication and authorization mechanisms (including cloud IAM)
• Experience working according to agile principles
Interpersonal Skills
• Strong interpersonal and communication skills, including the ability to explain and discuss technicalities of solutions, algorithms and techniques with colleagues and clients from other disciplines
• Curiosity, proactivity and critical thinking
• Ability to collaborate with people at all levels and with multi-office/region teams
• Ability to work independently and juggle priorities to thrive in a fast-paced and ambiguous environment, while also collaborating as part of a team in complex situations
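Since the role names Python server-side frameworks such as FastAPI, here is a minimal hedged sketch of such a service; the endpoint, model, and data-store names are invented for illustration and do not describe Bain's actual stack.

```python
# Minimal FastAPI sketch, assuming a simple analytics-style read endpoint.
# All names (/health, /metrics/{case_id}, FAKE_STORE) are illustrative.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="case-metrics-api")

class Metric(BaseModel):
    case_id: str
    value: float

# In-memory stand-in for a real OLAP/OLTP backend.
FAKE_STORE = {"case-001": 42.0}

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

@app.get("/metrics/{case_id}", response_model=Metric)
def get_metric(case_id: str) -> Metric:
    if case_id not in FAKE_STORE:
        raise HTTPException(status_code=404, detail="unknown case")
    return Metric(case_id=case_id, value=FAKE_STORE[case_id])
```

Run locally with, for example, `uvicorn app:app --reload`.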
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Database Administrator, you will be responsible for maintaining and optimizing MS SQL Server environments. Your duties will include installing, configuring, and upgrading MS SQL Server, SSRS, and IIS, as well as troubleshooting and problem-solving related to databases. You will also be involved in periodic monitoring, performance improvements, and maintenance of databases, including rollout and updates. Your role will also involve coding, monitoring, and tuning database objects for optimal performance, evaluating database patches and service packs, and controlling user privileges and resources to maintain overall database security. Additionally, you will assist in database and resource capacity planning, support project managers and business analysts with reporting requirements, and design and test backup and restore procedures for databases, hardware system fail-over capabilities, and disaster recovery.
You will be responsible for creating, maintaining, and updating SQL Server infrastructure documentation, designing and developing enterprise security policies for database systems, and performing problem determination and troubleshooting of technical issues. Furthermore, you will monitor and train support staff on basic MS-SQL commands and procedures, ensure system performance meets specifications through debugging and fine-tuning, and define backup procedures for full database recoveries.
Your role will also involve reviewing and developing new stored procedures for developers, providing on-call support for systems and associated software products on a 24/7 rotation basis, and ensuring that backup procedures are well-defined and documented. Your expertise in SQL Server tools, database design practices, and high availability strategies will be essential in maintaining the efficiency and security of database systems.
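As a hedged illustration of the monitoring side of this role, the Python snippet below uses pyodbc to list currently running SQL Server requests by elapsed time, a common starting point for performance triage; the server, driver, and authentication details are placeholders.

```python
# Hypothetical monitoring sketch using pyodbc against SQL Server DMVs.
# Connection details below are placeholders, not a real environment.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=master;Trusted_Connection=yes;"
)

query = """
SELECT r.session_id, r.status, r.total_elapsed_time, t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
ORDER BY r.total_elapsed_time DESC;
"""

# Print the longest-running requests first, truncating statement text.
for row in conn.cursor().execute(query):
    print(row.session_id, row.status, row.total_elapsed_time,
          (row.sql_text or "")[:80])
```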
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Evernorth
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don’t, won’t or can’t. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.
Position Overview
Excited to grow your career? This position’s primary responsibility will be to translate software requirements into functions using Mainframe, ETL, and Data Engineering skills, with expertise in Databricks and database technologies. This position offers the opportunity to work on modernizing legacy systems, contribute to cloud infrastructure automation, and support production systems in a fast-paced, agile environment. You will work across multiple teams and technologies to ensure reliable, high-performance data solutions that align with business goals. As a Mainframe & ETL Engineer, you will be responsible for the end-to-end development and support of data processing solutions using tools such as Talend, Ab Initio, AWS Glue, and PySpark, with significant work on Databricks and modern cloud data platforms. You will support infrastructure provisioning using Terraform, assist in modernizing legacy systems including mainframe migration, and contribute to performance tuning of complex SQL queries across multiple database platforms including Teradata, Oracle, Postgres, and DB2. You will also be involved in CI/CD practices.
Responsibilities
- Support, maintain and participate in the development of software utilizing technologies such as COBOL, DB2, CICS and JCL.
- Support, maintain and participate in the ETL development of software utilizing technologies such as Talend, Ab Initio, Python, and PySpark on Databricks.
- Work with Databricks to design and manage scalable data processing solutions.
- Implement and support data integration workflows across cloud (AWS) and on-premises environments.
- Support cloud infrastructure deployment and management using Terraform.
- Participate in the modernization of legacy systems, including mainframe migration.
- Perform complex SQL queries and performance tuning on large datasets.
- Contribute to CI/CD pipelines, version control, and infrastructure automation.
- Provide expertise, tools, and assistance to operations, development, and support teams for critical production issues and maintenance.
- Troubleshoot production issues, diagnose the problem, and implement a solution - the first line of defense in finding the root cause.
- Work cross-functionally with the support team, development team and business team to efficiently address customer issues.
- Be an active member of a high-performance software development and support team in an agile environment.
- Engage in fostering and improving organizational culture.
Qualifications
Required Skills:
- Strong analytical and technical skills.
- Proficiency in Databricks, including notebook development, Delta Lake, and Spark-based processing.
- Experience with mainframe modernization or migrating legacy systems to modern data platforms.
- Strong programming skills, particularly in PySpark for data processing.
- Familiarity with data warehousing concepts and cloud-native architecture.
- Solid understanding of Terraform for managing infrastructure as code on AWS.
- Familiarity with CI/CD practices and tools (e.g., Git, Jenkins).
- Strong SQL knowledge on OLAP DB platforms (Teradata, Snowflake) and OLTP DB platforms (Oracle, DB2, Postgres, SingleStore).
- Strong experience with Teradata SQL and utilities.
- Strong experience with Oracle, Postgres and DB2 SQL and utilities.
- Ability to develop high-quality database solutions.
- Ability to do extensive analysis on complex SQL processes, with strong design skills.
- Ability to analyze existing SQL queries for performance improvements.
- Experience in software development phases including design, configuration, testing, debugging, implementation, and support of large-scale, business-centric and process-based applications.
- Proven experience working with diverse teams of technical architects, business users and IT areas on all phases of the software development life cycle.
- Exceptional analytical and problem-solving skills.
- Structured, methodical approach to systems development and troubleshooting.
- Ability to ramp up fast on a system architecture.
- Experience in designing and developing process-based solutions or BPM (business process management).
- Strong written and verbal communication skills with the ability to interact with all levels of the organization.
- Strong interpersonal/relationship management skills.
- Strong time and project management skills.
- Familiarity with agile methodology including SCRUM team leadership.
- Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
- Desire to work in the application support space.
- Passion for learning and desire to explore all areas of IT.
Required Experience & Education
- Minimum of 8-12 years of experience in an application development role.
- Bachelor’s degree equivalent in Information Technology, Business Information Systems, Technology Management, or related field of study.
Location & Hours of Work: Hyderabad and Hybrid (1:00 PM IST to 10:00 PM IST)
Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.
About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
About Us
Observe.AI is transforming customer service with AI agents that speak, think, and act like your best human agents, helping enterprises automate routine customer calls and workflows, support agents in real time, and uncover powerful insights from every interaction. With Observe.AI, businesses boost automation, deliver faster, more consistent 24/7 service, and build stronger customer loyalty. Trusted by brands like Accolade, Prudential, Concentrix, Cox Automotive, and Included Health, Observe.AI is redefining how businesses connect with customers, driving better experiences and lasting relationships at every touchpoint.
The Opportunity
We are looking for a Senior Data Engineer with strong hands-on experience in building scalable data pipelines and real-time processing systems. You will be part of a high-impact team focused on modernizing our data architecture, enabling self-serve analytics, and delivering high-quality data products. This role is ideal for engineers who love solving complex data challenges, have a growth mindset, and are excited to work on both batch and streaming systems.
What you'll be doing:
- Build and maintain real-time and batch data pipelines using tools like Kafka, Spark, and Airflow.
- Contribute to the development of a scalable LakeHouse architecture using modern data formats such as Delta Lake, Hudi, or Iceberg.
- Optimize data ingestion and transformation workflows across cloud platforms (AWS, GCP, or Azure).
- Collaborate with Analytics and Product teams to deliver data models, marts, and dashboards that drive business insights.
- Support data quality, lineage, and observability using modern practices and tools.
- Participate in Agile processes (Sprint Planning, Reviews) and contribute to team knowledge sharing and documentation.
- Contribute to building data products for inbound (ingestion) and outbound (consumption) use cases across the organization.
Who you are:
- 5-8 years of experience in data engineering or backend systems with a focus on large-scale data pipelines.
- Hands-on experience with streaming platforms (e.g., Kafka) and distributed processing tools (e.g., Spark or Flink).
- Working knowledge of LakeHouse formats (Delta/Hudi/Iceberg) and columnar storage like Parquet.
- Proficient in building pipelines on AWS, GCP, or Azure using managed services and cloud-native tools.
- Experience in Airflow or similar orchestration platforms.
- Strong in data modeling and optimizing data warehouses like Redshift, BigQuery, or Snowflake.
- Exposure to real-time OLAP tools like ClickHouse, Druid, or Pinot.
- Familiarity with observability tools such as Grafana, Prometheus, or Loki.
- Some experience integrating data with MLOps tools like MLflow, SageMaker, or Kubeflow.
- Ability to work with Agile practices using JIRA, Confluence, and participating in engineering ceremonies.
Compensation, Benefits and Perks
- Excellent medical insurance options and free online doctor consultations
- Yearly privilege and sick leaves as per the Karnataka S&E Act
- Generous holidays (national and festive), recognition, and parental leave policies
- Learning & Development fund to support your continuous learning journey and professional development
- Fun events to build culture across the organization
- Flexible benefit plans for tax exemptions (i.e. meal card, PF, etc.)
Our Commitment to Inclusion and Belonging
Observe.AI is an Equal Employment Opportunity employer that proudly pursues and hires a diverse workforce.
Observe.AI does not make hiring or employment decisions on the basis of race, color, religion or religious belief, ethnic or national origin, nationality, sex, gender, gender identity, sexual orientation, disability, age, military or veteran status, or any other basis protected by applicable local, state, or federal laws or prohibited by Company policy. Observe.AI also strives for a healthy and safe workplace and strictly prohibits harassment of any kind. We welcome all people. We celebrate diversity of all kinds and are committed to creating an inclusive culture built on a foundation of respect for all individuals. We seek to hire, develop, and retain talented people from all backgrounds. Individuals from non-traditional backgrounds, historically marginalized or underrepresented groups are strongly encouraged to apply. If you are ambitious, make an impact wherever you go, and you're ready to shape the future of Observe.AI, we encourage you to apply. For more information, visit www.observe.ai.
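As a rough sketch of the streaming pipeline work this posting describes (Kafka plus Spark feeding a LakeHouse table), consider the following; the topic, schema, checkpoint path, and table name are invented for illustration.

```python
# Hedged sketch: Kafka topic -> Spark Structured Streaming -> Delta table.
# All names below (broker, topic "interactions", schema fields) are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("interactions-stream").getOrCreate()

schema = StructType([
    StructField("call_id", StringType()),
    StructField("agent_id", StringType()),
    StructField("sentiment", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "interactions")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table; requires Delta Lake support on the cluster.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/interactions")
    .outputMode("append")
    .toTable("lakehouse.interactions")
)
```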
Posted 1 week ago
1.0 - 8.0 years
3 - 10 Lacs
Bengaluru
Work from Office
Primary Responsibilities
F5xc SRE: Play the role of a hands-on SRE Engineer focused on automation and toil reduction, and participate in Ops cycles to support our product. Perform the on-call support function on a rotation basis, providing timely resolution of issues and ensuring operational excellence in managing and maintaining distributed networking and security products.
Easy-to-Use Automation: Continue to grow the infra-automation (k8s, ArgoCD, Helm charts, Golang services, AWS, GCP, Terraform) with a focus on ease of configuration.
Environment Stability using Observability: Create and continue to evolve existing observability (metrics & alerts) and participate in regular monitoring of infrastructure for stability.
Collaborative Engagement: Collaborate closely with application owners and SRE team members as part of roadmap execution and continuous improvement of existing systems.
Scale & Resilient Systems: Design and deploy systems/infra that are highly available and resilient for the configured failure domains. Design systems using strong security principles, with security by default.
Knowledge, Skills and Abilities
Elasticsearch: Deep understanding of indexing strategies, query optimization, cluster management, and tuning for high-throughput use cases. Familiarity with slow query analysis, scaling, and shard management.
ClickHouse: Proven experience in designing and managing OLAP workloads, optimizing query performance, and implementing efficient table engines and materialized views.
Apache Kafka: Expertise in event streaming architecture, topic design, producer/consumer configuration, and handling high-volume, low-latency data pipelines. Experience with Kafka Connect and Schema Registry is a plus.
Vector (Datadog/Timber.io/Logs): Proficiency in configuring Vector for observability pipelines, including log transformation, enrichment, and routing to multiple sinks (e.g., Elasticsearch, S3, ClickHouse).
Cortex: Hands-on experience with the Cortex suite of observability tools, including Cortex, Loki, Tempo, and Prometheus integration for scalable, multi-tenant monitoring systems. Familiar with integrating Cortex/Mimir with Grafana dashboards, Thanos, or Prometheus Remote Write to support observability-as-a-service use cases.
Programming: Hands-on programming experience in at least one language (Python, Golang) plus shell scripting.
Networking: Strong networking fundamentals and experience dealing with different layers of the networking stack.
SRE/DevOps on Linux & Kubernetes: Demonstrate excellent, hands-on knowledge of deploying workloads and managing their lifecycle on Kubernetes, with practical experience debugging issues. Experience upgrading workloads for SaaS services without downtime.
On-call: Experience in managing everyday Ops for production environments. Experience in production alerts management and using dashboards to debug issues.
GitOps: Experience with Helm charts/kustomizations and GitOps tools like ArgoCD/FluxCD.
CI/CD: Experience working with/designing functional CI/CD systems.
Cloud Infrastructure: Prior experience in deploying workloads and managing lifecycle on any cloud provider (AWS/GCP/Azure).
Equal Employment Opportunity
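To make the observability-pipeline idea concrete, here is a small hedged Python sketch that consumes JSON log events from Kafka and batch-inserts them into ClickHouse; the topic, table, and host names are placeholders, not part of the actual F5 stack.

```python
# Illustrative observability-pipeline snippet: Kafka JSON logs -> ClickHouse.
# Topic, table, and hosts are placeholders invented for this sketch.
import json
from kafka import KafkaConsumer            # kafka-python
from clickhouse_driver import Client       # clickhouse-driver

consumer = KafkaConsumer(
    "app-logs",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
ch = Client(host="clickhouse-host")

batch, BATCH_SIZE = [], 500
for msg in consumer:
    event = msg.value
    batch.append((event["ts"], event["level"], event["message"]))
    if len(batch) >= BATCH_SIZE:
        # Batched inserts keep ClickHouse merge pressure low.
        ch.execute("INSERT INTO logs (ts, level, message) VALUES", batch)
        batch.clear()
```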
Posted 1 week ago
12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
WHAT MAKES US A GREAT PLACE TO WORK
We are proud to be consistently recognized as one of the world’s best places to work. We are currently the #1 ranked consulting firm on Glassdoor’s Best Places to Work list and have maintained a spot in the top four on Glassdoor’s list since its founding in 2009. Extraordinary teams are at the heart of our business strategy, but these don’t happen by chance. They require intentional focus on bringing together a broad set of backgrounds, cultures, experiences, perspectives, and skills in a supportive and inclusive work environment. We hire people with exceptional talent and create an environment in which every individual can thrive professionally and personally.
WHO YOU’LL WORK WITH
You’ll join our Engineering experts within the AI, Insights & Solutions team. This team is part of Bain’s digital capabilities practice, which includes experts in analytics, engineering, product management, and design. In this multidisciplinary environment, you'll leverage deep technical expertise with business acumen to help clients tackle their most transformative challenges. You’ll work on integrated teams alongside our general consultants and clients to develop data-driven strategies and innovative solutions. Together, we create human-centric solutions that harness the power of data and artificial intelligence to drive competitive advantage for our clients. Our collaborative and supportive work environment fosters creativity and continuous learning, enabling us to consistently deliver exceptional results. We are committed to building a diverse and inclusive team and encourage candidates of all backgrounds to apply. Bain offers comprehensive benefits and flexible policies that are designed to support you, so you can thrive personally and professionally.
WHAT YOU’LL DO
As an Expert Senior Manager, Software Engineering (Tech Lead), you will lead the development and application of technical solutions to address complex problems in various industries. You will mentor and guide a diverse engineering team through the entire engineering life cycle. Your responsibilities will include designing, developing, optimizing, and deploying cutting-edge data engineering solutions and infrastructure at the production scale required by the world’s largest companies.
• Collaborate closely with and influence general consulting teams to identify software solutions to client business problems, and to appropriately scope, prioritize and execute those solutions
• Overall technical leader responsible for end-to-end technical solution delivery on client cases (from solution architecture to hands-on development work)
• Lead the entire software development life cycle, including architecture design, writing clean code, conducting code reviews, writing documentation, unit/integration tests, and identifying issues and resolutions
• Participate in expert client advisory activities that require deep expertise in software engineering with distributed systems and application architecture
• Collaborate on (or lead) the development of re-usable common frameworks, models and components that can be highly leveraged to address common software engineering problems across industries and business functions
• Work with the team and other senior leaders to create a great working environment that attracts other great engineers
• Coach engineering teams at our clients and partners to raise their capabilities and ensure that our work is successfully deployed to the highest standards
• Drive best demonstrated practices in software engineering, and share learnings with team members in AAG about theoretical and technical developments in software engineering
• Drive industry-leading innovations that translate into great impact for our clients in case work
• Act as PD Advisor as needed
• Lead recruiting and onboarding for other team members
Travel is required (30%)
Location: Singapore (Hybrid)
ABOUT YOU
Required
• Minimum 12 years of relevant professional hands-on experience in web development, programming languages, version control, software design patterns, infrastructure and deployment, and integration and unit testing implementation
• Minimum 3 years of experience managing software engineers
• Master’s degree in Computer Science, Engineering, or a related technical field
• Commercial acumen and understanding of business models
• Experience leading multiple projects independently and growing and developing more junior engineers
Technical Skills and Knowledge:
• Track record of shipping production, enterprise-scale AI applications and data analytics products
• Expert knowledge (5+ years) of Python
• Deep experience with additional server-side frameworks and technologies such as FastAPI, Node.js, Flask
• Experience with Cloud platforms and services (AWS, Azure, GCP, etc.)
• Experience working in accordance with DevSecOps principles, and familiarity with industry deployment best practices using CI/CD tools, MLOps, LLMOps and infrastructure as code (Jenkins, Docker, Kubernetes, and Terraform)
• Strong computer science fundamentals in data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the implications of computer architecture on software performance
• Experience with data architecture, database schema design, database scalability and SQL
• Experience with client-side technologies such as React, Angular, Vue.js, HTML and CSS
• Hands-on experience in designing and optimizing OLTP systems for real-time processing, as well as building scalable OLAP data pipelines for batch and streaming analytics
• Good to have: experience developing AI-driven applications and solutions
• Understanding of data security and privacy regulations, key topics in cybersecurity, and authentication and authorization mechanisms (including cloud IAM)
• Experience working according to agile principles
Interpersonal Skills:
• Strong interpersonal and communication skills, including the ability to explain and discuss technicalities of solutions, algorithms and techniques with colleagues and clients from other disciplines
• Curiosity, proactivity and critical thinking
• Ability to collaborate with people at all levels and with multi-office/region teams
• Ability to work independently and juggle priorities to thrive in a fast-paced and ambiguous environment, while also collaborating as part of a team in complex situations
Posted 1 week ago
2.0 - 6.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Not Applicable
Specialism: SAP
Management Level: Senior Associate
Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
Role and responsibilities
- Create service offerings
- Create customer presentations (pitch decks based on service offerings)
- Help in winning new deals and managing them
- Help in taking the offerings to market along with the Sales team
- Help in account mining/farming to grow the customer accounts
- Ensure the services are delivered as per the contractual agreement with the customer
Responsibilities
- Architecting and overall end-to-end design, deployment, and delivery of Azure Data Platforms, across Data Lakes, Data Warehouses, Data Lakehouses, pipelines, Databricks, BI and Data Analytics solutions
- Remaining up to date in new and emerging technologies
- Working with clients to develop data technology strategy and roadmaps, and plan delivery
- Oversight and support of delivery team outputs
- Data modelling, design, and build
- Infrastructure as Code delivery
- Enforcing technical architecture and documentation standards, policies, and procedures
- Analysing, implementing, and resolving issues with existing Azure Data Platforms
- Working with business experts and customers to understand business needs, and translate business requirements into reporting and data analytics functionality
- Assisting in scoping, estimation, and task planning for assigned projects
- Following the project work plans to meet functionality requirements, project objectives and timelines
- Providing accurate and complete technical architectural documents
- Addressing customer queries and issues in a timely manner
- Providing mentoring and hands-on guidance to other team members
Experience in designing and implementing Azure Data solutions using services such as:
- Azure Synapse Analytics
- Azure Databricks
- Azure Data Lake Storage Gen2
- Azure SQL Database
- Azure Data Factory
- Azure DevOps
- Azure Stream Analytics
- Azure Blob Storage
- Azure Cosmos DB
- ARM templates
Also expected:
- Familiarity with Microsoft Power BI
- Familiarity with Azure Purview
- An understanding of Master Data Management and Data Governance frameworks
- Familiarity with Infrastructure as Code approaches and implementations
- Familiarity with development approaches such as CI/CD
- Familiarity with Azure DevOps
- Strong communication and collaboration skills
- Strong analytical thinking and problem-solving skills
- Ability to work as a team member and leader in a diverse technical environment
- Customer-service orientation
- Ability to work in a fast-paced, changing environment
- Proficiency in spoken and written English
- Willingness to travel abroad when required
- Graduate-level education in Computer Science or a relevant field, or a widely recognised professional qualification at a comparable level
- Formal training and/or certification on related technologies is highly valued
- Minimum of three years working in a similar role
- Knowledge of Common Data Models/Industry Data Models/Synapse Analytics database templates will be considered an asset
- Experience in OLAP technology and the Microsoft on-premises BI stack (SSIS/SSRS/SSAS) will be useful but is not compulsory
The role is highly technical and requires a robust understanding and hands-on expertise of Microsoft Azure cloud technologies, data architecture and modelling concepts. The role also demands strong analytical, problem-solving, and planning skills. The role requires strong communication and collaboration skills and the motivation to achieve results in a dynamic business environment.
Required technical skill set: SQL, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse
Mandatory skill sets: Data Factory, Databricks, SQL DB, Python, ADLS, PySpark
Preferred skill sets: PBI, Power Apps stack, Data Modelling, AWS Lambda, Glue, EMR, Airflow, Kinesis, Redshift
Years of experience required: 6-10
Education qualifications: Bachelor's degree in Computer Science, Engineering, or a related field.
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Required Skills: Microsoft Azure
Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Travel Requirements
Government Clearance Required?
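As one hedged example of the Databricks-centred delivery work listed above, the PySpark snippet below implements a simple data-quality gate of the kind often embedded in Azure Data Factory/Databricks pipelines; the storage path, key column, and thresholds are illustrative and assume credentials are already configured on the cluster.

```python
# Hypothetical data-quality gate for an Azure Databricks pipeline step:
# fail fast when row counts or null-key rates breach thresholds.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()

# Illustrative ADLS Gen2 path; assumes storage access is configured.
df = spark.read.format("delta").load(
    "abfss://curated@mylake.dfs.core.windows.net/sales"
)

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()

if total == 0:
    raise ValueError("DQ gate failed: no rows ingested")
if null_keys / total > 0.01:  # tolerate at most 1% null keys
    raise ValueError(f"DQ gate failed: {null_keys}/{total} null order_id values")

print(f"DQ gate passed: {total} rows, {null_keys} null keys")
```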
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Company Overview: Syren Cloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. We have a team of over 350 employees and a robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.
Role Summary: As a Business Intelligence Engineer, you'll create and deliver data analytics solutions using tools like Power BI. You'll collaborate with business partners to define requirements, design data models, and build dashboards to provide actionable insights. Your work will help automate the digital supply chain for a global audience.
Key Responsibilities
- Develop dashboards and reports using Power BI.
- Design data models to transform raw data into insights.
- Work with the MS SQL Server BI stack (SSRS, T-SQL, Power Query, MDX, DAX).
- Collaborate with end users to gather and translate requirements.
- Enhance existing BI systems and ensure data security.
Qualifications
- 4+ years of Power BI experience.
- Proficiency in Power BI, SQL Server, T-SQL, Power Query, MDX, and DAX.
- Strong analytical and communication skills.
- Knowledge of database management systems and OLAP/OLTP.
- Comfortable working under deadlines in agile environments.
Posted 1 week ago
4.0 - 9.0 years
15 - 25 Lacs
Kolkata
Work from Office
Inviting applications for the role of Senior Principal Consultant - Power BI Developer!
Responsibilities:
• Working within a team to identify, design and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices
• Gathering query data from tables of the industry cognitive model/data lake and building data models with BI tools
• Applying requisite business logic using data transformation and DAX
• Understanding of Power BI data modelling and various in-built functions
• Knowledge of report sharing through Workspace/App, access management, dataset scheduling and Enterprise Gateway
• Understanding of static and dynamic row-level security
• Ability to create wireframes based on user stories and business requirements
• Basic understanding of ETL and data warehousing concepts
• Conceptualizing and developing industry-specific insights in the form of dashboards/reports/analytical web applications to deliver Pilots/Solutions following best practices
Qualifications we seek in you!
Minimum Qualifications
• Graduate
• Proficient in Power BI report development and data modeling.
• Strong analytical skills and ability to work independently.
• Experience in developing and implementing solutions in Power BI.
• Expertise in creating data models for report development in Power BI.
• Strong SQL skills and ability to interpret data.
• Proficient in overall testing of code and functionality.
• Optional: Knowledge of Snowflake.
• Preferred: Experience in finance projects/financial systems knowledge.
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Global Finance Analyst Power BI – Analysis & Insight
Lloyd’s Register
Location: Mumbai, India
What We’re Looking For
Convert financial data into informative visual reports and dashboards that help inform decision-making.
What We Offer You
The opportunity to work for an organization that has a strong sense of purpose, is value-driven and helps colleagues to develop professionally and personally through our range of people development programmes. A full-time permanent role.
The role
- Build automated reports and dashboards with the help of Power BI and other reporting tools.
- Extract data from various sources and transform raw data into meaningful insights to support Senior Leadership Teams, Executive Leadership Teams and the FP&A leads.
- Develop models/reports, delivering the desired data visualisation and business analytics results to support decision-making.
- Support FP&A ad hoc analysis.
What You Bring
- Qualified accountant (ACA or CIMA), currently operating at a senior finance level in a global organisation, able to perform at the highest levels whilst also demonstrating the ability to be hands-on when required. The appointee will measure their success by results and will have the resilience and maturity to manage internal relationships in an organisation going through rapid change.
- Experience of international multi-site and multi-currency organisations.
- Experience in handling data preparation: collection (from various sources), organising, and cleaning data to extract valuable insights.
- Data modelling experience and understanding of different technologies such as OLAP, statistical analysis, computer science algorithms, databases etc.
- Knowledge and experience working with Business Intelligence tools and systems like SAP, Power BI, Tableau, etc., preferably complemented by associated skills such as SQL, Power Query, DAX, Python, R etc.
- Experience of international multi-site commercial/operational activity.
- Ability to drill down and visualize data in the best possible way using charts, reports, or dashboards generated using Power BI.
- Ability to understand and assess complex and sometimes unfamiliar situations, visualise solutions and see them through to resolution, and work effectively within a matrix organisation.
- Ability to work successfully within a Finance Shared Service Centre model.
- Good attention to detail with a keen eye for errors and flaws in the data, to help LR work with the cleanest, most accurate data.
- Strong communication skills.
You Are Someone Who
- Is keen to take accountability and ownership for delivering customer needs.
- Can self-manage and prioritize tasks towards achieving goals.
- Is effective at solving problems, troubleshooting and making timely decisions.
- Is flexible and eager to take initiative.
- Communicates in a structured way and has the ability to present technical ideas in user-friendly language.
- Displays a team spirit, particularly in a multicultural environment.
- Responds positively to learning opportunities and is comfortable stepping out of their own comfort zone.
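To ground the data-preparation point above, here is a small hedged pandas sketch that cleans a raw ledger extract and produces a summary suitable for loading into a Power BI model; the file and column names are invented for illustration.

```python
# Hedged pandas sketch: clean a hypothetical multi-entity ledger extract and
# produce a spend summary ready for a BI model. Names are invented.
import pandas as pd

raw = pd.read_csv("ledger_extract.csv")  # hypothetical source file

clean = (
    raw.dropna(subset=["entity", "account"])
       .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"))
       .dropna(subset=["amount"])
)

# Simple insight: spend by entity and account, sorted for review.
summary = (
    clean.groupby(["entity", "account"], as_index=False)["amount"].sum()
         .sort_values("amount", ascending=False)
)
summary.to_csv("entity_account_summary.csv", index=False)
```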
Posted 1 week ago
5.0 years
3 - 7 Lacs
Hyderābād
On-site
Job Title: Databricks Developer / Data Engineer
Duration: 12 months with possible extension
Location: Hyderabad, Telangana (Hybrid); 1-2 days onsite at client location
Job Summary: We are seeking a highly skilled Databricks Developer / Data Engineer with 5+ years of experience in building scalable data pipelines, managing large datasets, and optimizing data workflows in cloud environments. The ideal candidate will have hands-on expertise in Azure Databricks, Azure Data Factory, and other Azure-native services, playing a key role in enabling data-driven decision-making across the organization.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion, transformation, and integration
- Work with both structured and unstructured data from a variety of internal and external sources
- Collaborate with data analysts, scientists, and engineers to ensure data quality, integrity, and availability
- Build and manage data lakes, data warehouses, and data models (Azure Databricks, Azure Data Factory, Snowflake, etc.)
- Optimize performance of large-scale batch and real-time processing systems
- Implement data governance, metadata management, and data lineage practices
- Monitor and troubleshoot pipeline issues; perform root cause analysis and proactive resolution
- Automate data validation and quality checks
- Ensure compliance with data privacy, security, and regulatory requirements
- Maintain thorough documentation of architecture, data workflows, and processes
Mandatory Qualifications:
- 5+ years of hands-on experience with: Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database; Azure Logic Apps, Azure Data Factory, Azure Databricks, Azure ML; Azure DevOps Services, Azure API Management, Webhooks
- Intermediate-level proficiency in Python scripting and PySpark
- Basic understanding of Power BI and visualization functionalities
Technical Skills & Experience Required:
- Proficient in SQL and working with both relational and non-relational databases (e.g., SQL, PostgreSQL, MongoDB, Cassandra)
- Hands-on experience with Apache Spark, Hadoop, Hive for big data processing
- Proficiency in building scalable data pipelines using Azure Data Factory and Azure Databricks
- Solid knowledge of cloud-native tools: Delta Lake, Azure ML, Azure DevOps
- Understanding of data modeling, OLAP/OLTP systems, and data warehousing best practices
- Experience with CI/CD pipelines, version control with Git, and working with Azure Repos
- Knowledge of data security, privacy policies, and compliance frameworks
- Excellent problem-solving, troubleshooting, and analytical skills
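As a hedged sketch of the Azure Blob Storage plus Python work this role mandates, the snippet below lists new files in a landing container and runs a basic validation before they are promoted; the connection string, container, and prefix are placeholders.

```python
# Hypothetical ingestion helper using the azure-storage-blob SDK: list new
# blobs under a landing prefix and validate them with pandas. All connection
# details are placeholders.
import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("landing")

for blob in container.list_blobs(name_starts_with="sales/"):
    data = container.download_blob(blob.name).readall()
    df = pd.read_csv(io.BytesIO(data))
    # Basic validation before promoting the file to the curated zone.
    assert not df.empty, f"{blob.name} is empty"
    print(blob.name, len(df), "rows")
```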
Posted 1 week ago