6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
Type: Contract | Duration: 6 Months
We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.
Key Responsibilities: Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory. Monitor and support production ETL jobs. Develop and maintain data lineage documentation for all systems. Design data mapping and documentation to aid QA/UAT testing. Evaluate and recommend modern data integration tools. Optimize shared data workflows and batch schedules. Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows. Participate in performance tuning and improvement recommendations. Support BI/MDM initiatives including Data Vault and Data Lakes.
Required Skills: 7+ years of experience in data engineering roles. Strong command of SQL, with 5+ years of hands-on development. Deep experience with Snowflake, Azure Data Factory, dbt. Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.). Bachelor's in CS, Engineering, Math, or related field. Experience in healthcare domain (working with PHI/PII data). Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments). Excellent communication and documentation skills. Experience with BI tools like Power BI, Cognos, etc. Organized, self-starter with strong time-management and critical thinking abilities.
Nice To Have: Experience with Data Lakes and Data Vaults. QA & UAT alignment with clear development documentation. Multi-cloud experience (especially Azure, AWS).
Location Options: Hyderabad / Chennai (Remote flexibility available)
Apply to: navaneeta@suzva.com | Contact: 9032956160
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Preferred Education: Master's Degree
Required Technical And Professional Expertise: Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
Preferred Technical And Professional Experience: Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
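As a small illustration of the enterprise-search responsibility mentioned in this role, here is a minimal sketch of indexing and querying with the Elasticsearch Python client. It assumes the 8.x client and a locally reachable cluster; the index name and document fields are hypothetical.

```python
# Minimal sketch: index a document and run a match query with the
# Elasticsearch 8.x Python client. Index name and fields are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumes a local cluster

# Index one record (the index is created on first write with dynamic mapping)
es.index(
    index="customer-events",
    id="evt-001",
    document={"customer_id": 42, "event": "login", "channel": "mobile"},
)

# Simple full-text match search over the indexed events
resp = es.search(
    index="customer-events",
    query={"match": {"event": "login"}},
    size=10,
)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"])
```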
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for a Developer (ETL) to become a key player in our Bangalore team. If you're a seasoned ETL pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.
This position will be part of Sapiens’ L&P division, for more information about it, click here: https://sapiens.com/solutions/life-and-pension-software/
What You’ll Do: Design and develop core components/services that are flexible, extensible, multi-tier, scalable, high-performance and reliable applications of an advanced complex software system, called ALIS, both in R&D and Delivery. Good understanding of advanced ETL concepts and administration activities to support R&D/Project. Experience with different ETL tools (min 4) and advanced transformations; good in Talend and SAP BODS to support R&D/Project. Able to resolve all ETL code and administration issues. Ability to resolve complex Reporting challenges. Ability to create full-fledged dashboards with story boards/lines, drill down, linking etc.; design tables, views or datamarts to support these dashboards. Ability to understand and propose data load strategies which improve performance and visualizations. Ability to performance tune SQL, ETL, Reports and Universes. Understand the Sapiens Intelligence product and support the points below: understands the Transaction Layer model for all modules; understands the Universe model for all modules; should have end-to-end Sapiens Intelligence knowledge; should be able to independently demo or give training for the Sapiens Intelligence product; should be an SME in Sapiens Intelligence as a product.
What To Have For This Position (Must-Have Skills): 3 - 5 years of IT experience. Should have experience understanding advanced Insurance concepts and a good command over all Business/Functional Areas (like NB, Claims, Finance, etc.). Should have experience with developing a complete DWH ETL lifecycle. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. - using ETL tools such as Talend, BODS, SSIS etc. Experience or knowledge in Bigdata-related tools (Spark, Hive, Kafka, Hadoop, Hortonworks, Python, R) would be an advantage. Should have experience in developing SAP BO or any Reporting tool knowledge. Should be able to implement reusability, parameterization, workflow design, etc. Should have experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and Low/High level design documents. Experience in understanding complex source system data structures – preferably in Insurance services (Insurance preferred). Experience in Data Analysis, Data Modeling and Data Mart design. Strong database development skills like complex SQL queries, complex stored procedures. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Ability to work with minimal guidance or supervision in a time critical environment. Willingness to travel and work at various customer sites across the globe.
About Sapiens
Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries.
Sapiens’ platform offers pre-integrated, low-code capabilities to accelerate customers’ digital transformation. With more than 40 years of industry expertise, Sapiens has a highly professional team of over 5,000 employees globally. For more information, visit us at www.sapiens.com. Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds.
Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them and where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com.
Posted 1 month ago
7.0 years
0 Lacs
India
Remote
Job Summary We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making. Experience: 7 - 12 years Work Location: Hyderabad (Hybrid) / Remote Mandatory skills: AWS, Python, SQL, Airflow, DBT Must have done 1 or 2 projects in Clinical Domain/Clinical Industry. Responsibilities Design and Develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads. Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage. Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks. Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases. Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions. Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering. Required Skills Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7 - 12+ years of experience in data engineering Cloud Platforms: Strong expertise in AWS data services. Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake. Programming: Proficiency in Python, Scala, or Java for data processing and automation. ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica. Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications Skills: aws,talend,python,snowflake,dbt,databricks,mongodb,postgresql,airflow,data engineering,etl,gcp,bigquery,sql,redshift,cassandra,azure,informatica
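As a small illustration of the Airflow plus dbt orchestration this role lists among its mandatory skills, here is a hedged sketch of a daily ELT DAG. It assumes Airflow 2.4 or later; the DAG name, ingestion logic, and dbt command are placeholders, not a real project.

```python
# Minimal Airflow 2.x sketch: a daily ELT DAG that ingests an extract and then
# triggers a dbt build. Names, paths, and commands are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_to_staging(**context):
    # Placeholder: pull the daily extract and land it in a staging area
    print(f"Ingesting extract for {context['ds']}")


with DAG(
    dag_id="clinical_elt_daily",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_to_staging", python_callable=ingest_to_staging)
    transform = BashOperator(task_id="dbt_build", bash_command="dbt build --project-dir /opt/dbt")
    ingest >> transform
```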
Posted 1 month ago
5.0 years
0 Lacs
Udaipur, Rajasthan, India
Remote
At GKM IT , we’re looking for a DevOps Engineer - Senior II who thrives at the intersection of strategy and execution. If you have a passion for building scalable, secure, and high-performing infrastructure, this role is your opportunity to make a direct impact. You’ll play a key role in designing and optimising systems that support complex, high-load environments—while collaborating with top-tier engineering teams to bring stability, speed, and innovation to everything we deploy. If you're someone who enjoys solving real-world infrastructure challenges and thrives in fast-paced, production-critical setups—we’d love to work with you! Requirements Minimum 5+ years of experience in DevOps roles managing production-grade systems Implement CI/CD pipelines using Jenkins, GitHub Actions, CircleCI, or Azure DevOps Strong expertise in AWS, Terraform, Kubernetes, CI/CD, Linux, and network security Manage, monitor, and optimize distributed databases (PostgreSQL, MySQL, MongoDB, cloud-native databases) Define and manage Infrastructure as Code using Terraform, Ansible, or CloudFormation Demonstrate deep expertise in Linux internals, kernel tuning, scripting (Bash/Python), and networking Design and implement resilient, secure, and scalable infrastructure to support high-traffic applications Architect solutions for high availability, cost-efficiency, and performance optimization at enterprise scale Integrate and operate across multi-cloud or hybrid environments (AWS, Azure, on-prem) Design and maintain ETL/serverless data pipelines using Apache Airflow, AWS Glue, Lambda, and Talend Optimize data pipeline reliability, scheduling, error handling, and CI/CD integration Implement infrastructure-level security controls (CIS hardening, IAM, encryption, firewall rules) Practical experience with compliance frameworks like SOC 2, HIPAA, and internal audits Build and maintain observability systems (Prometheus, Grafana, ELK/Loki, Datadog, CloudWatch) Manage networking stacks (iptables, routing, DNS, SSL, load balancing) and Linux server security Automate provisioning, patching, config management, and multi-stage deployments Manage access control and identity integration with Microsoft Active Directory and Entra ID Provide mentorship and technical leadership to junior DevOps engineers and interns through code reviews, technical sessions, and team-wide knowledge-sharing initiatives Preferred certifications: AWS Solutions Architect, RHCE, or equivalent Experience with container orchestration tools like Kubernetes, ECS, or Docker Swarm Benefits We don’t just hire employees—we invest in people. At GKM IT, we’ve designed a benefits experience that’s thoughtful, supportive, and actually useful. Here’s what you can look forward to: Top-Tier Work Setup You’ll be equipped with a premium MacBook and all the accessories you need. Great tools make great work. Flexible Schedules & Remote Support Life isn’t 9-to-5. Enjoy flexible working hours, emergency work-from-home days, and utility support that makes remote life easier. Quarterly Performance Bonuses We don’t believe in waiting a whole year to celebrate your success. Perform well, and you’ll see it in your pay check—quarterly. Learning is Funded Here Conferences, courses, certifications—if it helps you grow, we’ve got your back. We even offer a dedicated educational allowance. Family-First Culture Your loved ones matter to us too. From birthday and anniversary vouchers (Amazon, BookMyShow) to maternity and paternity leaves—we’re here for life outside work. 
Celebrations & Gifting, The GKM IT Way Onboarding hampers, festive goodies (Diwali, Holi, New Year), and company anniversary surprises—it’s always celebration season here. Team Bonding Moments We love food, and we love people. Quarterly lunches, dinners, and fun company retreats help us stay connected beyond the screen. Healthcare That Has You Covered Enjoy comprehensive health insurance for you and your family—because peace of mind shouldn’t be optional. Extra Rewards for Extra Effort Weekend work doesn’t go unnoticed, and great referrals don’t go unrewarded. From incentives to bonuses—you’ll feel appreciated.
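To illustrate the serverless data-pipeline work listed in this role's requirements (AWS Glue, Lambda, Airflow), here is a hedged sketch that starts a Glue job and polls its state with boto3. The job name, region, and polling cadence are assumptions; credentials come from the standard AWS chain.

```python
# Minimal sketch: trigger an AWS Glue job and poll until it finishes, using boto3.
# The job name and region are hypothetical.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="nightly-orders-etl")      # hypothetical job
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```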
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com . This position is based in Bengaluru and will require some on-site work. Purpose And Scope As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You’ll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend, DataBricks) will be essential for testing data pipelines. Essential Job Responsibilities Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments. Test Strategy And Planning Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT). Data Validation And Quality Assurance Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards. Regression Testing Validate changes to data pipelines and analytics tools. Monitor performance metrics. Test Case Design And Execution Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation. Data Security And Privacy Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations. Collaboration And Communication Work with cross-functional teams. Communicate test progress and results. Continuous Improvement And Technical Support Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms. Qualifications Required Bachelor’s degree in computer science, information technology, or related field (or equivalent experience.) 3 - 5+ years proven experience as a Tester, Developer or Data Analyst within a Pharmaceutical or working within a similar regulatory environment. 3 - 5+ years' experience in using BI Development, ETL Development, Qlik, PowerBI including DAX and Power Automate (MS Flow) or PowerBI alerts or equivalent technologies. Experience with QLIK Sense and QLIKView, Tableau application and creating data models. Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts). Knowledge of SQL, ETL frameworks and data integration techniques. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical. 
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full large complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM). Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred: Experience working in the Pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous. Certifications in BI tools or testing methodologies. Knowledge of cloud-based BI solutions (e.g., Azure, AWS). Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments. Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges. Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.
Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines.
Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
Posted 1 month ago
6.0 - 10.0 years
9 - 9 Lacs
Hyderābād
On-site
Summary: Charles River Development (CRD) is the FinTech division of State Street. Together with State Street’s Middle and Back-office services, Charles River’s cloud-based Front Office technology forms the foundation of the State Street Alpha® Platform, the first front-to-back solution in the industry. The Alpha Data Platform lets you load, enrich and aggregate investment data. Leveraging this, our clients will be able to manage multi-asset class data from any service provider or data vendor for a more holistic and integrated view of their holdings. This platform reflects State Street’s years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. As the Data Integration Developer/Sr Data Integration Developer, you are responsible for the overall development life cycle leading to successful delivery and support of Alpha Data Platform Services to clients.
Responsibilities: As a Data Integration Developer/Sr Developer, you will be hands-on with ETL data pipelines, the Snowflake data warehouse, CI/CD deployment pipelines and data-readiness (data quality) design, development and implementation, and will address code or data issues. You will conduct query performance tuning/optimizations and data loads to meet the SLAs for both batch and real-time data use cases. Part of your role will require you to investigate problems and work independently to deliver clear and concise solutions. As a global team, leverage your collaborative experience to work across regions (APAC, EMEA, and North America) to come up with design standards, high-level design solutions documents, cross training and data onboarding activities. You will partake in the creation of artifacts to align with the global SDLC process and Governance clearance, conduct peer code reviews and Unit Test Results, be involved in Code deployments, and create Confluence Jira/Kanban stories. Leverage SQL query debugging and defect issue resolution processes to conduct root cause analysis while working with multiple business/IT stakeholders.
Qualifications: Education: B.S. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, and Physics or other technical course of study required. MS degree strongly preferred. Experience: A minimum of 6-10 years of experience in data integration/orchestration services. Strong data warehousing concepts, ETL tools such as the Talend Cloud Data Integration tool. Strong SQL knowledge and debugging skills are a must. Experience in service architecture and providing data driven solutions for client requirements. Experience on Microsoft Azure cloud and Snowflake SQL, database query/performance tuning. Exposure to financial domain knowledge is considered a plus. Prior experience with State Street and Charles River Development (CRD) considered a plus. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
Posted 1 month ago
15.0 years
0 Lacs
India
Remote
About the Role: We are looking for a hands-on AWS Data Architect OR Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.
Experience: 7–15 Years
Location: Fully Remote
Company: Armakuni India
Key Responsibilities: Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape. Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics. Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases. Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes. Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security. Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes. Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces. Performance Tuning: Optimize database performance through tuning, indexing, and query optimization. Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Required Skills: Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server). Minimum 7 to 15 years of experience in data architecture or related roles. Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow). Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of data integration tools (e.g., Informatica, Talend, FiveTran, Meltano). Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery). Experience with data governance frameworks and tools.
Posted 1 month ago
0 years
0 Lacs
Andhra Pradesh
On-site
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces in multiple platforms to ensure all designs render correctly and systems function properly. Convert jobs from Talend ETL to Python and convert Lead SQLs to Snowflake. This requires developers with Python and SQL skills. Developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
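A minimal sketch of the Talend-to-Python conversion pattern this role describes: a pandas transformation loaded into Snowflake with the Python connector's write_pandas helper. The connection parameters, file, table, and column names are hypothetical, and the target table is assumed to already exist.

```python
# Minimal sketch: replace a Talend transform with pandas and load the result
# into Snowflake via write_pandas. All names and credentials are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Transformation a Talend tMap might have done: drop bad rows, aggregate by day
orders = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])
daily = (
    orders.dropna(subset=["customer_id"])
          .groupby(orders["order_date"].dt.date)["amount"]
          .sum()
          .reset_index()
)
daily.columns = ["ORDER_DATE", "TOTAL_AMOUNT"]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
# Appends the frame into an existing table whose columns match the frame
success, n_chunks, n_rows, _ = write_pandas(conn, daily, "DAILY_ORDER_TOTALS")
print("loaded rows:", n_rows)
conn.close()
```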
Posted 1 month ago
0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Data Modeller JD
We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices.
Data Modelling: Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards. Create conceptual, logical, and physical data models that support the bank's strategic objectives. Ensure data models are optimized for performance, security, and scalability to support business operations and analytics.
Collaboration With Data Architect: Work closely with the Data Architect to establish the overall data architecture strategy and framework. Contribute to the definition of data model structures within a data mesh environment.
Data Quality And Governance: Ensure data quality and integrity in the data models by implementing best practices in data governance. Assist in the establishment of data management policies and standards. Conduct regular data audits and reviews to ensure data accuracy and consistency across systems.
Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools.
Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j Graph.
Data Warehousing Technologies: Snowflake, Teradata, or similar.
ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar.
Big Data Technologies: Hadoop, Spark (optional but preferred).
Technologies: Experience with data modelling on cloud platforms such as Microsoft Azure (Synapse, Data Factory).
Posted 1 month ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Design, develop, and maintain ETL processes using Ab Initio and other ETL tools. Manage and optimize data pipelines on AWS. Write and maintain complex PL/SQL queries for data extraction, transformation, and loading. Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in ETL and cloud computing. Requirements Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certification in Ab Initio. Proven experience with AWS and cloud-based data solutions. Strong proficiency in PL/SQL and other ETL tools. Experience in providing Level 3 support for ETL processes. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications Experience with other ETL tools such as Informatica, Talend, or DataStage. Knowledge of data warehousing concepts and best practices. Familiarity with scripting languages (e.g., Python, Shell scripting).
Posted 1 month ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Design, develop, and maintain ETL processes using Talend. Manage and optimize data pipelines on Amazon Redshift. Implement data transformation workflows using DBT (Data Build Tool). Write efficient, reusable, and reliable code in PySpark. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in data engineering. Requirements Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer or similar role. High proficiency in Talend. Strong experience with Amazon Redshift. Expertise in DBT and PySpark. Experience with data modeling, ETL processes, and data warehousing. Familiarity with cloud platforms and services. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications Experience with other data engineering tools and frameworks. Knowledge of machine learning frameworks and libraries.
Posted 1 month ago
0.0 - 2.0 years
0 Lacs
Pune, Maharashtra
On-site
Location: Pune, Maharashtra, India Employment Type: Permanent Work Mode: Hybrid Experience required: 5 - 7 Years
Description: We are looking for a highly skilled Lead Data Quality Engineer to drive data accuracy, consistency, and integrity across our data ecosystem. The ideal candidate will be responsible for designing, implementing, and overseeing data quality frameworks, ensuring compliance with best practices, and collaborating with cross-functional teams to maintain high data standards.
Required Skills: SQL, ETL processes, Data integration tools, Python, Data Quality tools, Data Governance tools, Cloud platforms, Big Data technologies, Problem-solving skills, Communication skills.
Responsibilities: Develop and implement data quality frameworks, policies, and best practices to enhance data governance and integrity. Conduct data profiling, anomaly detection, and root cause analysis to identify and resolve data quality issues. Implement automated and manual data validation techniques to ensure completeness, consistency, and accuracy. Ensure adherence to data governance principles, regulatory requirements, and industry standards. Work closely with data engineering teams to maintain and enhance data pipelines with embedded quality checks. Develop automated data quality tests, monitoring dashboards, and alerts using SQL, Python, or other data tools. Partner with data engineers, analysts, and business teams to establish quality metrics and ensure alignment on data quality objectives. Track and report data quality KPIs, create dashboards, and provide insights to leadership.
Qualifications: 7+ years of experience in data quality, data governance, or data engineering roles, with at least 2 years in a leadership capacity. Strong expertise in SQL for data querying, validation, and analysis. Experience with ETL processes, data pipelines, and data integration tools (Airflow, Talend, Informatica, DBT, etc.). Proficiency in Python, PySpark, or other scripting languages for data automation. Hands-on experience with Data Quality and Governance tools (Collibra, Alation, Talend DQ, Great Expectations, etc.). Knowledge of cloud platforms (AWS, Azure, GCP) and modern data architectures. Familiarity with Big Data technologies (Spark, Snowflake, Databricks, etc.) is a plus. Strong problem-solving and analytical skills. Excellent communication and stakeholder management abilities. Ability to lead and mentor a team of data engineers or analysts. Detail-oriented with a proactive approach to data quality management. Experience in regulated industries (finance, healthcare, etc.) with data compliance knowledge (GDPR, HIPAA, etc.) is preferred. Exposure to machine learning data quality frameworks is a plus. Data certification (e.g., CDMP, Collibra Ranger, or similar) is a plus.
Preferred Qualifications: Experience in regulated industries (finance, healthcare, etc.) with data compliance knowledge (GDPR, HIPAA, etc.) is preferred. Exposure to machine learning data quality frameworks is a plus. Data certification (e.g., CDMP, Collibra Ranger, or similar) is a plus.
Job Type: Permanent Pay: ₹447,558.25 - ₹1,200,000.00 per year Work Location: In person Speak with the employer +91 8122359328 Application Deadline: 21/06/2025 Expected Start Date: 24/06/2025
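The profiling and validation duties described in this role can be illustrated with a few plain pandas checks. This is a hedged sketch only; the source file, column names, and thresholds are hypothetical and stand in for a real data-quality framework configuration.

```python
# Minimal data-quality sketch: completeness, uniqueness, and range checks on a
# pandas DataFrame. Column names and thresholds are hypothetical.
import pandas as pd

df = pd.read_parquet("customers.parquet")   # placeholder source

checks = {
    "customer_id not null": df["customer_id"].notna().mean() == 1.0,
    "customer_id unique":   not df["customer_id"].duplicated().any(),
    "email mostly present": df["email"].notna().mean() >= 0.95,
    "age in valid range":   df["age"].between(0, 120).all(),
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    raise ValueError(f"Data quality checks failed: {failures}")
print("All data quality checks passed")
```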
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Design, develop, and maintain ETL processes using Ab Initio and other ETL tools. Manage and optimize data pipelines on AWS. Write and maintain complex PL/SQL queries for data extraction, transformation, and loading. Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in ETL and cloud computing. Requirements Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certification in Ab Initio. Proven experience with AWS and cloud-based data solutions. Strong proficiency in PL/SQL and other ETL tools. Experience in providing Level 3 support for ETL processes. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications Experience with other ETL tools such as Informatica, Talend, or DataStage. Knowledge of data warehousing concepts and best practices. Familiarity with scripting languages (e.g., Python, Shell scripting).
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com . This position is based in Bengaluru and will require some on-site work. Purpose And Scope As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You’ll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend, DataBricks) will be essential for testing data pipelines. Essential Job Responsibilities Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments. Test Strategy And Planning Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT). Data Validation And Quality Assurance Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards. Regression Testing Validate changes to data pipelines and analytics tools. Monitor performance metrics. Test Case Design And Execution Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation. Data Security And Privacy Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations. Collaboration And Communication Work with cross-functional teams. Communicate test progress and results. Continuous Improvement And Technical Support Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms. Qualifications Required Bachelor’s degree in computer science, information technology, or related field (or equivalent experience.) 3 -5+ years proven experience as a Tester, Developer or Data Analyst within a Pharmaceutical or working within a similar regulatory environment. 3-5+ years experience in using BI Development, ETL Development, Qlik, PowerBI including DAX and Power Automate (MS Flow) or PowerBI alerts or equivalent technologies. Experience with QLIK Sense and QLIKView, Tableau application and creating data models. Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts). Knowledge of SQL, ETL frameworks and data integration techniques. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical. 
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full large complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM). Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred: Experience working in the Pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous. Certifications in BI tools or testing methodologies. Knowledge of cloud-based BI solutions (e.g., Azure, AWS). Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments. Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges. Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.
Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines.
Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
Posted 1 month ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Job Description
Job Title – ETL Testing – Python & SQL
Candidate Specification – 5+ years, Open for Shift – 1PM to 10 PM. ETL (Python) – all 5 days WFO, ETL (SQL) – Hybrid. Location – Chennai.
Job Description: Experience in ETL testing or data warehouse testing. Strong in SQL Server, MySQL, or Snowflake. Strong in scripting languages such as Python. Strong understanding of data warehousing concepts, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling. Proficient in writing SQL queries for data validation and reconciliation. Experience with testing tools such as HP ALM, JIRA, TestRail, or similar. Excellent problem-solving skills and attention to detail.
Skills Required
Role: ETL Testing
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent
Key Skills: ETL, Python, SQL
Other Information
Job Code: GO/JC/185/2025
Recruiter Name: Sheena Rakesh
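As an illustration of the source-to-target validation this testing role calls for, here is a hedged pytest-style sketch that reconciles row counts and a simple total between a source and a target. In-memory SQLite stands in for the real systems (e.g., a MySQL source and a Snowflake target); table names and data are hypothetical.

```python
# Minimal ETL-testing sketch: reconcile row counts and an amount total between
# source and target. SQLite is used here only so the example is self-contained.
import sqlite3
import pytest


def scalar(conn, sql):
    """Run a query and return the single scalar result."""
    return conn.execute(sql).fetchone()[0]


@pytest.fixture
def source_conn():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.5), (2, 20.0)])
    return conn


@pytest.fixture
def target_conn():
    # In a real test this would be the warehouse connection (e.g., Snowflake)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", [(1, 10.5), (2, 20.0)])
    return conn


def test_row_counts_match(source_conn, target_conn):
    assert scalar(source_conn, "SELECT COUNT(*) FROM orders") == \
           scalar(target_conn, "SELECT COUNT(*) FROM fact_orders")


def test_amount_totals_match(source_conn, target_conn):
    src = scalar(source_conn, "SELECT SUM(amount) FROM orders")
    tgt = scalar(target_conn, "SELECT SUM(amount) FROM fact_orders")
    assert abs(src - tgt) < 0.01   # tolerate rounding differences
```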
Posted 1 month ago
1.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary: We are looking for a passionate and detail-oriented ETL Developer with 1 to 4 years of experience in building, testing, and maintaining ETL processes. The ideal candidate should have a strong understanding of data warehousing concepts, ETL tools, and database technologies.
Key Responsibilities:
✅ Design, develop, and maintain ETL workflows and processes using [specify tools e.g., Informatica / Talend / SSIS / Pentaho / custom ETL frameworks].
✅ Understand data requirements and translate them into technical specifications and ETL designs.
✅ Optimize and troubleshoot ETL processes for performance and scalability.
✅ Ensure data quality, integrity, and security across all ETL jobs.
✅ Perform data analysis and validation for business reporting.
✅ Collaborate with Data Engineers, DBAs, and Business Analysts to ensure smooth data operations.
Required Skills:
• 1-4 years of hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, Pentaho, or equivalent).
• Proficiency in SQL and experience working with RDBMS (e.g., SQL Server, Oracle, MySQL, PostgreSQL).
• Good understanding of data warehousing concepts and data modeling.
• Experience in handling large datasets and performance tuning of ETL jobs.
• Ability to work in Agile environments and participate in code reviews.
• Ability to learn and work with open-source languages like Node.js and AngularJS.
Preferred Skills (Good to Have):
• Experience with cloud ETL solutions (AWS Glue, Azure Data Factory, GCP Dataflow).
• Exposure to big data ecosystems (Hadoop, Spark).
Qualifications:
🎓 Bachelor’s degree in Computer Science, Engineering, Information Technology, or related field.
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Job Title: Data Engineer
Candidate Specification: 5+ years, Immediate to 30 days. (All 5 days work from office, 9 hours.)
Job Description: Experience with any modern ETL tools (PySpark or EMR, or Glue or others). Experience in AWS, programming knowledge in Python, Java, Snowflake. Experience in DBT, StreamSets (or similar tools like Informatica, Talend), migration work done in the past. Agile experience is required with Version One or Jira tool expertise. Provide hands-on technical solutions to business challenges and translate them into process/technical solutions. Good knowledge of CI/CD and DevOps principles. Experience in data technologies - Hadoop, PySpark / Scala (any one).
Skills Required
Role: Data Engineer
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: B Tech
Employment Type: Full Time, Permanent
Key Skills: PySpark, EMR, Glue, ETL tools, AWS, CI/CD, DevOps
Other Information
Job Code: GO/JC/102/2025
Recruiter Name: Sheena Rakesh
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes. Key Responsibilities: Designing and developing scalable data solutions on the Snowflake platform to support business needs and analytics requirements. Leading the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading processes. Writing efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake. Implementing automation scripts and tools using Python to streamline data workflows and improve efficiency. Collaborating with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions. Performance tuning and optimization of Snowflake databases and queries to ensure optimal performance and scalability. Implementing best practices for data governance, security, and compliance within Snowflake environments. Mentoring junior team members and providing technical guidance and support as needed. Qualifications: Bachelor's degree in Computer Science, Engineering, or related field. 7+ years of experience working with Snowflake data warehouse. Strong proficiency in SQL with the ability to write complex queries and optimize performance. Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend. Strong Python coding experience needed minimum 2 yrs Solid understanding of data warehousing concepts, data modeling, and schema design. Experience working with cloud platforms such as AWS, Azure, or GCP. Excellent problem-solving and analytical skills with a keen attention to detail. Strong communication and collaboration skills with the ability to work effectively in a team environment. Any relevant certifications in Snowflake or related technologies would be a plus
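As a small illustration of the performance-tuning duties described above, here is a hedged sketch that pulls the slowest recent queries from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view using the Python connector. The account details are placeholders, and the role used must have privileges on the ACCOUNT_USAGE schema.

```python
# Minimal sketch: list the slowest queries of the last 24 hours from
# SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY. Credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="perf_user", password="***", warehouse="ADMIN_WH",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
    """
)
for query_id, warehouse, seconds in cur.fetchall():
    print(f"{query_id}  {warehouse}  {seconds:.1f}s")
conn.close()
```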
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer Location: Hyderabad, Kochi, Trivandrum Experience Required: 10-19 Yrs Skills: Primary - Scala, Pyspark, Python / Secondary - ETL, SQL, Azure Role Proficiency The role demands expertise in building robust, scalable data pipelines that support ingestion, wrangling, transformation, and integration of data from multiple sources. The ideal candidate should have hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, GCP DataProc), and strong programming skills in Python, PySpark, SQL, and optionally Scala. Proficiency across various data domains and familiarity with modern data warehouse and lakehouse architectures (Snowflake, BigQuery, Delta Lake, Lakehouse) is essential. A solid understanding of DevOps and infrastructure cost optimization is required. Key Responsibilities & Outcomes Technical Development Develop high-performance data pipelines and applications. Optimize development using design patterns and reusable solutions. Create and tune code using best practices for performance and scalability. Develop schemas, data models, and data storage solutions (SQL/NoSQL/Delta Lake). Perform debugging, testing, and validation to ensure solution quality. Documentation & Design Produce high-level and low-level design (HLD, LLD, SAD) and architecture documentation. Prepare infra costing, source-target mappings, and business requirement documentation. Contribute to and govern documentation standards/templates/checklists. Project & Team Management Support Project Manager in planning, delivery, and sprint execution. Estimate effort and provide input on resource planning. Lead and mentor junior team members, define goals, and monitor progress. Monitor and manage defect lifecycle including RCA and proactive quality improvements. Customer Interaction Gather and clarify requirements with customers and architects. Present design alternatives and conduct product demos. Ensure alignment with customer expectations and solution architecture. Testing & Release Design and review unit/integration test cases and execution strategies. Provide support during system/integration testing and UAT. Oversee and execute release cycles and configurations. Knowledge Management & Compliance Maintain compliance with configuration management plans. Contribute to internal knowledge repositories and reusable assets. Stay updated and certified on relevant technologies/domains. Measures of Success (KPIs) Adherence to engineering processes and delivery schedules. Number of post-delivery defects and non-compliance issues. Reduction in recurring defects and faster resolution of production bugs. Timeliness in detecting, responding to, and resolving pipeline/data issues. Improvements in pipeline efficiency (e.g., runtime, resource utilization). Team engagement and upskilling; completion of relevant certifications. Zero or minimal data security/compliance breaches. Expected Deliverables Code High-quality data transformation scripts and pipelines. Peer-reviewed, optimized, and reusable code. Documentation Design documents, technical specifications, test plans, and infra cost estimations. Configuration & Testing Configuration management plans and test execution results. Knowledge Sharing Contributions to SharePoint, internal wikis, client university platforms. Skill Requirements Mandatory Technical Skills Languages : Python, PySpark, Scala ETL Tools : Apache Airflow, Talend, Informatica, AWS Glue, Databricks, DataProc Cloud Platforms : AWS, GCP, Azure (esp. 
BigQuery, DataFlow, ADF, ADLS) Data Warehousing : Snowflake, BigQuery, Delta Lake, Lakehouse architecture Performance Tuning : For large-scale distributed systems and pipelines Additional Skills Experience in data model design and optimization. Good understanding of data schemas, window functions, and data partitioning strategies. Awareness of data governance, security standards, and compliance. Familiarity with DevOps, CI/CD, infrastructure cost estimation. Certifications (Preferred) Cloud certifications (e.g., AWS Data Analytics, GCP Data Engineer) Informatica or Databricks certification Domain-specific certifications based on project/client need Soft Skills Strong analytical and problem-solving capabilities Excellent communication and documentation skills Ability to work independently and collaboratively in cross-functional teams Stakeholder management and customer interaction
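A minimal PySpark sketch of the ingestion-and-transformation work this role describes: read raw files, apply a simple wrangling step, and write partitioned output. The paths and column names are hypothetical.

```python
# Minimal PySpark sketch: read raw CSV, clean and type the data, and write
# partitioned Parquet. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://raw-bucket/orders/*.csv"))

clean = (raw
         .dropna(subset=["order_id", "customer_id"])
         .withColumn("order_date", F.to_date("order_ts"))
         .withColumn("amount", F.col("amount").cast("double")))

(clean.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://curated-bucket/orders/"))

spark.stop()
```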
Posted 1 month ago
6.0 years
0 Lacs
India
Remote
One of our clients is looking for a Talend ETL Developer – Remote role (with annual office visits), with 6 to 14 years of hands-on experience in Talend ETL development, including work with Kinaxis Maestro Planning Server, SAP/APO, and other ERP systems. Preferred Work Locations - Kochi, Chennai, Pune
Key Responsibilities: Design and develop Talend jobs for loading data into Kinaxis Maestro Planning Server. Ensure adherence to coding standards by performing unit testing, documenting results, and sharing with business users for validation. Deploy Talend jobs to the Talend Administration Center (TAC) from the Nexus repository. Schedule jobs using cron-based or file-based triggers within TAC. Monitor job performance and create execution plans for sequential or parallel processing in TAC. Integrate Talend jobs with various systems, including XML, JSON, REST APIs, SFTP, and flat files. Use Kinaxis connectors to extract and transform data from enterprise systems like SAP, APO, and data lakes. Collaborate with cross-functional teams to ensure data accuracy, optimize performance, and align with business requirements.
Required Skills & Experience: 6+ years of hands-on experience with Talend ETL development. Proficiency in integrating Talend with systems such as SAP, Data Warehouses, and other enterprise systems. Strong expertise in ETL design, data transformations, and working with XML/JSON formats. Experience deploying Talend jobs to Nexus and managing them in TAC Job Conductor. Strong background in job scheduling, monitoring, and execution planning in TAC. Experience with Kinaxis extractors and ERP data sourcing from SAP/APO. Working knowledge of SFTP components and file-based integration. Strong proficiency in SQL and experience working with relational databases for efficient data manipulation. Experience in using Talend's API components to handle GET and POST requests for integration. Knowledge of Java for general coding tasks and automation within ETL workflows. Experience with version control using Git for managing code repositories.
Note - This Job Post is Valid for Only 22 Hours. Please Apply Quickly
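To illustrate the GET/POST integration pattern this Talend role describes (the kind of exchange its REST components handle against planning endpoints), here is a hedged Python sketch using the requests library. The URL, token, and payload shape are entirely hypothetical and do not represent the Kinaxis API.

```python
# Minimal sketch of a GET/POST integration pattern similar to what Talend REST
# components do. Endpoint, token, and payload are hypothetical.
import requests

BASE_URL = "https://example.invalid/api/v1"     # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

# GET: pull planning records
resp = requests.get(f"{BASE_URL}/planning/orders", headers=HEADERS, timeout=30)
resp.raise_for_status()
orders = resp.json()

# POST: push transformed records back
payload = [{"order_id": o["id"], "status": "PLANNED"} for o in orders]
post = requests.post(f"{BASE_URL}/planning/updates", json=payload, headers=HEADERS, timeout=30)
post.raise_for_status()
print("posted", len(payload), "records, status", post.status_code)
```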
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst/Senior Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field
Work Experience: 4 – 8 years

Job Description
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment, and production lines for the world's largest industrial sectors, including aerospace, automotive, steel, aluminium, glass, cement, logistics, and energy. Headquartered in Paris, Fives operates in about 25 countries with more than 9,000 employees.
Fives is seeking a Data Analyst/Senior Data Analyst for its office in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for Group IT requirements. We are looking for an analyst specialized in BI development.

Required Skills
Applicants should have skills/experience in the following areas:
- 4 – 8 years of experience in Power BI development
- Good understanding of data visualization concepts
- Proficiency in writing DAX expressions and Power Query
- Knowledge of SQL and database-related technologies
- Source control such as Git
- Proficient in building REST APIs to interact with data sources
- Familiarity with ETL/ELT concepts and tools such as Talend is a plus
- Good knowledge of programming, algorithms, and data structures
- Ability to use Agile collaboration tools such as Jira
- Good communication skills, both verbal and written
- Willingness to learn new technologies and tools

Position Type: Full-Time/Regular
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Title: Data Engineer
Location: Remote
Employment type: Full-time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack
Azure | Databricks | PySpark | SQL

What We're Looking For
- 3+ years of experience in data engineering or analytics engineering
- Hands-on experience with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience with modern data engineering, data warehousing, and data lake technologies on cloud platforms such as Azure, AWS, and GCP, or on Databricks; Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling (see the sketch after this posting).
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools such as ADF, Informatica, and Talend, and data warehousing technologies such as Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, and Google BigQuery.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
- SAP ECC/S/4HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, disability, or any other federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
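Purely illustrative, not part of the ad: a minimal PySpark/Spark SQL sketch of an incremental upsert into a dimension table, touching the SQL, dimensional-modelling, and Spark SQL skills listed above. Table names, columns, and paths are hypothetical, and the MERGE statement assumes a Delta Lake target table already exists.

```python
# Illustrative sketch only -- table names, columns, and the Delta setup are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim-upsert-sketch").getOrCreate()

# Hypothetical staging data landed by an upstream ingestion pipeline (e.g., ADF).
spark.read.parquet("/mnt/staging/customers").createOrReplaceTempView("stg_customers")

# Incremental Type 1 upsert into a customer dimension using Spark SQL MERGE
# (supported on Delta Lake tables; syntax may differ on other warehouses).
spark.sql("""
    MERGE INTO dim_customer AS d
    USING stg_customers AS s
      ON d.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET
      d.customer_name = s.customer_name,
      d.segment       = s.segment,
      d.updated_at    = current_timestamp()
    WHEN NOT MATCHED THEN INSERT
      (customer_id, customer_name, segment, updated_at)
      VALUES (s.customer_id, s.customer_name, s.segment, current_timestamp())
""")
```

A Type 2 (history-preserving) dimension would instead expire the matched row and insert a new version; the MERGE skeleton stays the same.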
Posted 1 month ago
12.0 - 20.0 years
35 - 50 Lacs
Bengaluru
Hybrid
Data Architect with cloud expertise: data architecture, data integration, and data engineering. ETL/ELT - Talend, Informatica, Apache NiFi. Big Data - Hadoop, Spark. Cloud platforms (AWS, Azure, GCP), Redshift, BigQuery. Python, SQL, Scala. GDPR, CCPA.
Posted 1 month ago