5.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Job Title: Data Analyst / Technical Business Analyst

Job Summary
We are looking for a skilled Data Analyst to support a large-scale data migration initiative within the banking and insurance domain. The role involves analyzing, validating, and transforming data from legacy systems to modern platforms, ensuring regulatory compliance, data integrity, and business continuity.

Key Responsibilities
- Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements.
- Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies.
- Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation.
- Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows.
- Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims).
- Validate migrated data against business rules, regulatory standards, and reconciliation reports.
- Support UAT by preparing test cases and validating migrated data with business users.
- Ensure data privacy and security compliance throughout the migration process.
- Document issues, risks, and resolutions related to data quality and migration.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, Finance, or a related field.
- 5+ years of experience in data analysis or data migration projects in banking or insurance.
- Strong SQL skills and experience with data profiling and cleansing.
- Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data visualization tools (e.g., Power BI, Tableau).
- Experience working with big data platforms (e.g., Hadoop, Spark, Hive) and handling large volumes of structured and unstructured data.
- Understanding of banking and insurance data domains (e.g., customer data, transactions, policies, claims, underwriting).
- Knowledge of regulatory and compliance requirements (e.g., AML, KYC, GDPR, IRDAI guidelines).
- Excellent analytical, documentation, and communication skills.

Preferred Qualifications
- Experience with core banking systems (e.g., Finacle, Flexcube) or insurance platforms.
- Exposure to cloud data platforms (e.g., AWS, Azure, GCP).
- Experience working in Agile/Scrum environments.
- Certification in Business Analysis (e.g., CBAP, CCBA) or Data Analytics.
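The reconciliation duties above boil down to comparing source and target tables on counts, keys, and aggregates. A minimal, self-contained sketch of that pattern follows; the table names (legacy_policies, migrated_policies) and columns are invented, and SQLite stands in for the real legacy and target databases.

```python
# Hypothetical source-to-target reconciliation of the kind a migration analyst
# runs after a load. Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-ins for a legacy table and its migrated counterpart.
cur.executescript("""
    CREATE TABLE legacy_policies  (policy_id TEXT PRIMARY KEY, premium REAL);
    CREATE TABLE migrated_policies(policy_id TEXT PRIMARY KEY, premium REAL);
    INSERT INTO legacy_policies   VALUES ('P001', 1200.0), ('P002', 850.5), ('P003', 990.0);
    INSERT INTO migrated_policies VALUES ('P001', 1200.0), ('P002', 850.5);
""")

# Check 1: row counts must match.
src_count = cur.execute("SELECT COUNT(*) FROM legacy_policies").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM migrated_policies").fetchone()[0]
print(f"row counts: source={src_count} target={tgt_count} match={src_count == tgt_count}")

# Check 2: keys present in source but missing from target.
missing = cur.execute("""
    SELECT l.policy_id FROM legacy_policies l
    LEFT JOIN migrated_policies m ON m.policy_id = l.policy_id
    WHERE m.policy_id IS NULL
""").fetchall()
print("missing in target:", [row[0] for row in missing])

# Check 3: aggregates (sum of premium) must reconcile.
src_sum = cur.execute("SELECT SUM(premium) FROM legacy_policies").fetchone()[0]
tgt_sum = cur.execute("SELECT SUM(premium) FROM migrated_policies").fetchone()[0]
print(f"premium totals: source={src_sum} target={tgt_sum} delta={src_sum - tgt_sum}")
```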
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred Technical and Professional Experience
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
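As a hedged illustration of the Snowflake ingestion-and-transformation expertise listed above, the sketch below uses the snowflake-connector-python package; the account, stage, and table names are placeholders, not details of any IBM environment.

```python
# Illustrative Snowflake load-and-promote step: bulk-load staged files, then
# insert cleansed rows into a reporting table. All identifiers are invented.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER", password="***", account="my_account",  # placeholder credentials
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    # Bulk-load files that were previously PUT to an internal stage.
    cur.execute(
        "COPY INTO staging_orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Promote only valid rows; TRY_TO_DATE quietly nulls unparseable dates.
    cur.execute("""
        INSERT INTO reporting.orders_clean
        SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date, amount
        FROM staging_orders
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```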
Posted 1 week ago
20.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Data Solution Architect

Job Summary:
The Senior Data Solution Architect is a visionary and technical leader responsible for designing and guiding enterprise-scale data solutions. Leveraging 20+ years of experience, this individual works closely with business and IT stakeholders to deliver scalable, secure, and high-performing data architectures that support strategic goals, data-driven innovation, and digital transformation. This role encompasses solution design, platform modernization, cloud data architecture, and deep integration with enterprise systems.

Key Responsibilities:

Solution Architecture & Design
- Lead the end-to-end architecture of complex data solutions across domains including analytics, AI/ML, MDM, and real-time processing.
- Design robust, scalable, and future-ready data architectures using modern technologies (e.g., cloud data platforms, streaming, NoSQL, graph databases).
- Deliver solutions that balance performance, scalability, security, and cost-efficiency.

Enterprise Data Integration
- Architect seamless data integration across legacy systems, SaaS platforms, IoT, APIs, and third-party data sources.
- Define and implement enterprise-wide ETL/ELT strategies using tools like Informatica, Talend, DBT, Azure Data Factory, or AWS Glue.
- Support real-time and event-driven architecture with tools such as Kafka, Spark Streaming, or Flink.

Cloud Data Platforms & Infrastructure
- Design cloud-native data solutions on AWS, Azure, or GCP (e.g., Redshift, Snowflake, BigQuery, Databricks, Synapse).
- Lead cloud migration strategies from legacy systems to modern, cloud-based data architectures.
- Define standards for cloud data governance, cost management, and performance optimization.

Data Governance, Security & Compliance
- Partner with governance teams to enforce enterprise data governance frameworks.
- Ensure solutions comply with regulations such as GDPR, HIPAA, CCPA, and industry-specific mandates.
- Embed security and privacy by design in data architectures (encryption, role-based access, masking, etc.).

Technical Leadership & Stakeholder Engagement
- Serve as a technical advisor to CIOs, CDOs, and senior business executives on data strategy and platform decisions.
- Mentor architecture and engineering teams; provide guidance on solution patterns and best practices.
- Facilitate architecture reviews, proof-of-concepts (POCs), and technology evaluations.

Innovation & Continuous Improvement
- Stay abreast of emerging trends in data engineering, AI, data mesh, data fabric, and edge computing.
- Evaluate and introduce innovative tools and patterns (e.g., serverless data pipelines, federated data access).
- Drive architectural modernization, legacy decommissioning, and platform simplification.

Qualifications:

Education:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field; Master's or MBA preferred.

Experience:
- 20+ years in IT with at least 10 years in data architecture or solution architecture roles.
- Demonstrated experience in large-scale, complex data platform architecture and enterprise transformations.
- Deep experience with multiple database technologies (SQL, NoSQL, columnar, time series).
- Strong programming/scripting background (e.g., Python, Scala, Java, SQL).
- Proven experience architecting on at least one major cloud provider (AWS, Azure, GCP).
- Familiarity with DevOps, CI/CD, and DataOps practices.
Preferred Certifications:
- AWS/Azure/GCP Solution Architect (Professional level preferred)
- TOGAF or Zachman Framework Certification
- Snowflake/Databricks Certified Architect
- CDMP (Certified Data Management Professional) or DGSP

Key Competencies:
- Strategic and conceptual thinking with the ability to translate business needs into technical solutions.
- Exceptional communication, presentation, and negotiation skills.
- Leadership in cross-functional teams and matrix environments.
- Deep understanding of business processes, data monetization, and digital strategy.

Success Indicators:
- Delivery of transformative data platforms that enhance analytics and decision-making.
- Improved data integration, quality, and access across the enterprise.
- Successful migration to cloud-native or hybrid architectures.
- Reduction of technical debt and legacy system dependencies.
- Increased reuse of solution patterns, accelerators, and frameworks.
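The event-driven integration pattern named under Enterprise Data Integration can be pictured with a small producer sketch. This assumes the kafka-python client; the broker address, topic, and payload are invented for illustration.

```python
# Minimal event producer: each business event lands on a topic that downstream
# Spark Streaming / Flink jobs consume. All names are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("policy-updates", {"policy_id": "P001", "event": "premium_changed", "amount": 1250.0})
producer.flush()  # block until the broker has acknowledged the send
```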
Posted 1 week ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Data Engineering & Pipeline Development
- Design, implement, and maintain ETL processes using ADF and ADB.
- Create and manage views in ADB and SQL for efficient data access.
- Optimize SQL queries for large datasets and high performance.
- Conduct end-to-end testing and impact analysis on data pipelines.

Optimization & Performance Tuning
- Identify and resolve bottlenecks in data processing.
- Optimize SQL queries and Delta Tables for fast data processing.

Data Sharing & Integration
- Implement Delta Share, SQL Endpoints, and other data sharing methods.
- Use Delta Tables for efficient data sharing and processing.

API Integration & Development
- Integrate external systems through Databricks Notebooks and build scalable solutions.
- Experience in building APIs (good to have).

Collaboration & Documentation
- Collaborate with teams to understand requirements and design solutions.
- Provide documentation for data processes and architectures.
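A minimal PySpark sketch of the Delta-table work described above (read, filter, write partitioned, expose a view for SQL access); paths, columns, and table names are illustrative only.

```python
# Hedged sketch of a curated-layer Delta pipeline on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read raw orders from a Delta table, standardize types, keep only valid rows.
orders = (
    spark.read.format("delta").load("/mnt/raw/orders")
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("amount") > 0)
)

# Write the curated result back as Delta, partitioned for faster downstream queries.
(orders.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save("/mnt/curated/orders"))

# Expose a view so SQL endpoints and BI tools can query the curated data.
orders.createOrReplaceTempView("curated_orders")
spark.sql(
    "SELECT order_date, SUM(amount) AS revenue FROM curated_orders GROUP BY order_date"
).show()
```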
Posted 1 week ago
3.0 - 7.0 years
9 - 14 Lacs
Bengaluru
Work from Office
- 3-4+ years of overall industry experience in designing, developing, and deploying ETL solutions using industry-standard ETL tools.
- 1+ years of hands-on experience in developing and productionizing solutions with Talend Data Integration.
- Extensive experience in designing end-to-end transformations and workflows using Talend Data Integration, as per requirement specifications.
- Good communication skills in spoken and written English.
Posted 1 week ago
2.0 - 7.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
CirrusLabs Private Limited is looking for a DWH ETL Developer to join our dynamic team and embark on a rewarding career journey.
- Consulting with data management teams to get a big-picture idea of the company's data storage needs.
- Presenting the company with warehousing options based on their storage needs.
- Designing and coding the data warehousing system to desired company specifications.
- Conducting preliminary testing of the warehousing environment before data is extracted.
- Extracting company data and transferring it into the new warehousing environment.
- Testing the new storage system once all the data has been transferred.
- Troubleshooting any issues that may arise.
- Providing maintenance support.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary:
We are seeking a highly skilled Lead Data Engineer / Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.

Experience: 8 - 12+ years
Work Location: Hyderabad (Hybrid)
Mandatory skills: Python, SQL, Snowflake
Contract to Hire - 6+ months

Responsibilities:
- Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads.
- Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
- Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
- Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
- Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
- Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
- Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 8 - 12+ years of experience in data engineering.
- Cloud Platforms: Strong expertise in AWS data services.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and related frameworks.
- Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
- Programming: Proficiency in Python, Scala, or Java for data processing and automation.
- ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica.
- Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
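Since Apache Airflow is named among the expected ETL tools, here is a hedged sketch of a three-stage extract-transform-load DAG in Airflow 2.x style; the DAG id, schedule, and task bodies are placeholders rather than the employer's actual pipeline.

```python
# Minimal ELT orchestration DAG: extract -> transform -> load, run daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems")

def transform():
    print("apply business rules, deduplicate, conform dimensions")

def load():
    print("merge curated data into Snowflake")

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain
```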
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dear Associate,

Greetings from TATA Consultancy Services!!

Thank you for expressing your interest in exploring a career possibility with the TCS Family. We have a job opportunity for ETL Test Engineer at Tata Consultancy Services.

Hiring For: ETL Test Engineer
Interview date: 14-June-25 (in-person drive)
Location: Bangalore
Experience: 4-10 years

Must Have:
1. SQL - Expert-level knowledge of core SQL concepts and querying.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL Automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems.
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA - Hands-on experience with test management tools, preferably ADO or JIRA.
13. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
14. Communication - Good communication skills to understand and collaborate with all stakeholders within the project.

If you are interested in this exciting opportunity, please share your updated resume at saranya.devi3@tcs.com along with the additional information mentioned below:

Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
University/Institute name:
Current Organization:
Willing to relocate to Bangalore:
Total Experience:
Relevant Experience (ETL Test Engineer):
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Available for in-person interview on 14-June-25:
Timings:
Attended interview with TCS in past (details):
Please share your iBegin portal EP id if already registered:

Note: Only eligible candidates with relevant experience will be contacted further.
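Points 5 and 10 above (validating data across source and target systems, and building test automation) often reduce to automated count and checksum comparisons. A self-contained sketch follows, with SQLite standing in for the real source and target databases and invented table names.

```python
# Illustrative automated ETL check: compare row counts and per-row checksums
# between a source table and its ETL target.
import hashlib
import sqlite3

def table_fingerprint(cur, table, key):
    """Return (row_count, {key: md5-of-row}) for a table, ordered by key."""
    rows = cur.execute(f"SELECT * FROM {table} ORDER BY {key}").fetchall()
    digests = {r[0]: hashlib.md5(repr(r).encode()).hexdigest() for r in rows}
    return len(rows), digests

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);   -- deliberate mismatch
""")

src_n, src_sig = table_fingerprint(cur, "src", "id")
tgt_n, tgt_sig = table_fingerprint(cur, "tgt", "id")
assert src_n == tgt_n, "row-count mismatch"
diffs = [k for k in src_sig if src_sig[k] != tgt_sig.get(k)]
print("rows with mismatched values:", diffs)   # -> [2]
```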
Posted 1 week ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Azure Data Engineer
Location: Gurugram

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

🧠 What You'll Do
🔹 Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
🔹 Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
🔹 Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

🎯 Tech Stack
☁️ Azure | 🧱 Databricks | 🐍 PySpark | 📊 SQL

👤 What We're Looking For
✅ 3+ years of experience in data engineering or analytics engineering
✅ Hands-on with cloud data platforms and large-scale data processing
✅ Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

Best Regards,
Santosh Cherukuri
Email: scherukuri@bayonesolutions.com
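One common task behind the data lakehouse bullet is an incremental upsert into a curated Delta table. A hedged sketch using the Delta Lake Python API follows; the paths and join key are invented for illustration.

```python
# Lakehouse upsert: merge an incremental batch into a curated Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental batch landed by an upstream ingestion job (placeholder path).
updates = spark.read.format("json").load("/mnt/landing/customers/")

target = DeltaTable.forPath(spark, "/mnt/curated/customers")
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # update customers that changed
 .whenNotMatchedInsertAll()   # insert brand-new customers
 .execute())
```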
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We have an exciting job opportunity - Lead Data Engineer: Snowflake

Experience: 7 to 10 years only
Mandatory skills: Minimum 3+ years of experience with Snowflake (either migrating from another DB to Snowflake or pulling data into Snowflake); SQL; any ETL tool - SSIS, Talend, Informatica, Databricks, ADF (preferred), etc.; team-handling experience
Location: Hyderabad / Chennai
Mode: Hybrid
Notice Period: Immediate joiner to 15 days

Job Summary:
We are seeking an experienced Lead Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions while providing strategic direction and leadership to a team of junior and mid-level data engineers. The ideal candidate will have deep expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The lead data engineer role has a strong focus on performance optimization, security, scalability, and Snowflake credit control and management. This is a tactical role requiring independent in-depth data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake.

Essential Functions and Tasks:
- Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Optimize Snowflake database performance, storage, and security.
- Provide guidance on Snowflake best practices.
- Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
- Ensure data quality, integrity, and governance across the organization.
- Provide technical leadership and mentorship to junior and mid-level data engineers.
- Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 7+ years of in-depth data engineering experience, with at least 3+ years of dedicated experience engineering solutions in a Snowflake environment.
- Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
- Strong experience with cloud platforms (preference to Azure) and their data services.
- Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
- Hands-on experience with scripting languages like Python for data processing.
- Strong understanding of data governance, security, and compliance best practices.
- Snowflake SnowPro certification; preference to the engineering course path.
- Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
- Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
- Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
- Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
- Ability to self-manage large, complex deliverables and document user stories and tasks through Azure DevOps.
- Personal accountability to committed sprint user stories and tasks.
- Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Ability to read, understand, and apply state/federal laws, regulations, and policies.
- Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
- Ability to remain flexible and work within a collaborative and fast-paced environment.
- Understand and comply with company policies and procedures.
- Strong oral, written, and interpersonal communication skills.
- Strong time management and organizational skills.

About Our Client:
Our client is a leading business solutions provider for facility-based physicians practicing anesthesia, emergency medicine, hospital medicine, and now radiology, through the recent combining of forces with Advocate RCM. The company is focused on Revenue Cycle Management and Advisory services. Having grown consistently every year, they now have over 5,000 employees and are headquartered in Dallas, US.

Kindly share your updated resume at ankita.jaiswal@firstwave-tech.com
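The Snowflake credit control and management focus above is typically monitored through the documented ACCOUNT_USAGE views. A hedged sketch, with placeholder credentials:

```python
# Illustrative credit-usage check using Snowflake's documented
# WAREHOUSE_METERING_HISTORY account-usage view.
import snowflake.connector

conn = snowflake.connector.connect(user="ADMIN", password="***", account="my_account")
cur = conn.cursor()
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
""")
for name, credits in cur.fetchall():
    print(f"{name}: {credits:.1f} credits in the last 7 days")
conn.close()
```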
Posted 1 week ago
4.0 years
0 Lacs
India
On-site
Job Summary:
We are seeking a skilled Semarchy MDM Consultant/Developer to join our data management team. The ideal candidate will have hands-on experience with Semarchy xDM and a deep understanding of MDM concepts, data modeling, data integration, and data governance.

Key Responsibilities:
- Design, develop, and implement Master Data Management (MDM) solutions using Semarchy xDM.
- Develop and configure data models, entities, match & merge rules, workflows, and data validations within the Semarchy platform.
- Integrate Semarchy with various data sources (ETL tools, APIs, databases).
- Collaborate with business analysts and data stewards to gather requirements and implement effective MDM strategies.
- Ensure data quality, consistency, and governance across business domains.
- Create documentation for technical designs, data flows, configurations, and operational processes.
- Monitor and optimize MDM performance and troubleshoot issues.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4+ years of experience in Master Data Management (MDM) implementations.
- 2+ years of hands-on experience in Semarchy xDM development and configuration.
- Strong SQL skills and knowledge of relational databases (e.g., Oracle, SQL Server, PostgreSQL).
- Experience with data integration tools (e.g., Talend, Informatica, Apache NiFi) is a plus.
- Understanding of MDM domains like Customer, Product, Supplier, etc.
- Experience in REST/SOAP APIs, data profiling, and data quality tools is an advantage.
- Good understanding of data governance, stewardship, and metadata management.
- Excellent problem-solving skills and communication abilities.
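Match-and-merge rules like those configured in Semarchy xDM pair attribute-similarity scores with thresholds. The toy sketch below illustrates the idea with Python's stdlib difflib; the threshold and attributes are invented, and real MDM platforms configure this declaratively rather than in code.

```python
# Toy match rule: two records are merge candidates if their names are similar
# enough AND an exact-match attribute (city) agrees.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

records = [
    {"id": 1, "name": "Acme Corp",  "city": "Pune"},
    {"id": 2, "name": "ACME Corp.", "city": "Pune"},
    {"id": 3, "name": "Zenith Ltd", "city": "Mumbai"},
]

MATCH_THRESHOLD = 0.85
pairs = []
for i, r1 in enumerate(records):
    for r2 in records[i + 1:]:
        score = similarity(r1["name"], r2["name"])
        if score >= MATCH_THRESHOLD and r1["city"] == r2["city"]:
            pairs.append((r1["id"], r2["id"], round(score, 2)))

print("candidate merges:", pairs)   # -> [(1, 2, 0.95)]
```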
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Engineer
Experience range: 4-10 years
Location: Current location must be Bangalore ONLY
NOTE: Candidates interested in a walk-in drive in Bangalore must apply.

Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and querying.
3. ETL Automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with test management tools, preferably ADO or JIRA.
9. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all stakeholders within the project.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Infrastructure Lead/Architect
Job Type: Full-Time
Location: On-site - Hyderabad, Pune, or New Delhi

Job Summary
Join our customer's team as an Infrastructure Lead/Architect and play a pivotal role in architecting, designing, and implementing next-generation cloud infrastructure solutions. You will drive cloud and data platform initiatives, ensure system scalability and security, and act as a technical leader, shaping the backbone of our customers' mission-critical applications.

Key Responsibilities
- Architect, design, and implement robust, scalable, and secure AWS cloud infrastructure utilizing services such as EC2, S3, Lambda, RDS, Redshift, and IAM.
- Lead the end-to-end design and deployment of high-performance, cost-efficient Databricks data pipelines, ensuring seamless integration with business objectives.
- Develop and manage data integration workflows using modern ETL tools in combination with Python and Java scripting.
- Collaborate with Data Engineering, DevOps, and Security teams to build resilient, highly available, and compliant systems aligned with operational standards.
- Act as a technical leader and mentor, guiding cross-functional teams through infrastructure design decisions and conducting in-depth code and architecture reviews.
- Oversee project planning, resource allocation, and deliverables, ensuring projects are executed on time and within budget.
- Proactively identify infrastructure bottlenecks, recommend process improvements, and drive automation initiatives.
- Maintain comprehensive documentation and uphold security and compliance standards across the infrastructure landscape.

Required Skills and Qualifications
- 8+ years of hands-on experience in IT infrastructure, cloud architecture, or related roles.
- Extensive expertise with AWS cloud services; AWS certifications are highly regarded.
- Deep experience with Databricks, including cluster deployment, Delta Lake, and machine learning integrations.
- Strong programming and scripting proficiency in Python and Java.
- Advanced knowledge of ETL/ELT processes and tools such as Apache NiFi, Talend, Airflow, or Informatica.
- Proven track record in project management, leading cross-functional teams; PMP or Agile/Scrum certifications are a plus.
- Familiarity with CI/CD workflows and Infrastructure as Code tools like Terraform and CloudFormation.
- Exceptional problem-solving, stakeholder management, and both written and verbal communication skills.

Preferred Qualifications
- Experience with big data platforms such as Spark or Hadoop.
- Background in regulated environments (e.g., finance, healthcare).
- Knowledge of Kubernetes and AWS container orchestration (EKS).
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 06/09/2025
Job Type: Full time
Industry: Technology
Work Experience: 5+ years
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600096

Job Description
What you'll be working on:
- Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Learn something new every day.

What we are looking for:
- Bachelor's or master's degree in a technical or business discipline or related experience; Master's degree preferred.
- 4+ years of hands-on experience effectively managing data platforms, data tools, and/or depth in data management technologies.
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and optimizing "big data" data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience with orchestration tools like Airflow.
- Experience with any of the ETL tools like Talend, Informatica, etc.
- Experience in data warehouse solutions like Snowflake, Redshift.
- Exposure to data visualization tools (Tableau, Sisense, Looker, Metabase, etc.).
- Knowledge of GitHub, JIRA is a plus.
- Familiarity with data warehouse and data governance practices.
- Experience developing software code in one or more programming languages (Java, JavaScript, Python, etc.) is a plus.

Requirements
Knowledge/Skills/Abilities/Behaviours:
- A "build-test-measure-improve" mentality and the drive to motivate and lead teams to achieve impactful deliverables.
- Passion for operational efficiency, quantitative performance metrics, and process orientation.
- Working knowledge of project planning methodologies, IT standards, and guidelines.
- Customer passion, business focus, and the ability to negotiate, facilitate, and build consensus.
- The ability to promote a team environment across a large set of separate agile teams and stakeholders.
- Experience with or knowledge of Agile software development methodologies.

Benefits
Work at SquareShift: We offer a stimulating atmosphere where your efforts will have a significant impact on our company's success. We are a fun, client-focussed, results-driven company that centers on developing high-quality software, not work schedules and dress codes. We are driven by people who have a passion for technology and innovation, and we are committed to continuous improvement.

Does this role excite you to join our team? Apply on the link below!
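The new API integrations bullet usually means pulling from an external API and landing raw payloads in S3 for downstream warehouse loads. A hedged sketch follows; the endpoint, bucket, and key layout are placeholders, and the endpoint is assumed to return a JSON array.

```python
# Illustrative API-to-S3 ingestion step for a raw landing zone.
import json
from datetime import date

import boto3
import requests

# Hypothetical partner API; parameters and URL are invented.
resp = requests.get(
    "https://api.example.com/v1/events",
    params={"since": "2025-01-01"},
    timeout=30,
)
resp.raise_for_status()
events = resp.json()  # assumed to be a JSON array of event objects

# Land the raw payload, partitioned by ingestion date, for later ELT.
s3 = boto3.client("s3")
s3.put_object(
    Bucket="company-data-lake",
    Key=f"raw/events/dt={date.today().isoformat()}/events.json",
    Body=json.dumps(events).encode("utf-8"),
)
print(f"landed {len(events)} events")
```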
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025

Description & Requirements

Introduction: A Career at HARMAN
HARMAN Technology Services (HTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN DTS, you solve challenges by creating innovative solutions.
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs.
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility.
- Empower companies to create new digital business models, enter new markets, and improve customer experiences.

About the Role
We are seeking an experienced "Azure Data Architect" to develop and implement data engineering projects, including an enterprise data hub, data lakehouse, or big data platform.

What You Will Do
- Create data pipelines for more efficient and repeatable data science projects.
- Design and implement data architecture solutions that support business requirements and meet organizational needs.
- Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams.
- Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems.
- Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently.
- Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes.
- Ensure compliance with regulatory and industry standards for data management and security.
- Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting.
- Ensure data quality, accuracy, and consistency across all data sources.
- Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi.
- Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio.
- Knowledge of data governance, data quality, and data security best practices.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with programming languages such as Python, Java, or Scala.
- Experience with data visualization tools such as Tableau, Power BI, or QlikView.
- Understanding of analytics and machine learning concepts and tools.
- Knowledge of project management methodologies and tools to manage and deliver complex data projects.
- Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra.
- Strong expertise in cloud-based databases such as AWS S3 / AWS Glue and AWS Redshift, and the Iceberg/Parquet file formats.
- Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data.
- Proficient in data integration techniques to combine data from various sources into a centralized location.
- Strong data modeling, data warehousing, and data integration skills.
What You Need
- 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead.
- 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects.
- Experience in working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired.
- A master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics.
- Demonstrated ability to manage data projects and diverse teams.
- Experience in creating data and analytics solutions.
- Experience in building data solutions in one or more domains - Industrial, Healthcare, Retail, Communication.
- Problem-solving, communication, and collaboration skills.
- Good knowledge of data visualization and reporting tools.
- Ability to normalize and standardize data as per key KPIs and metrics.
- Develop and implement data engineering projects, including a data lakehouse or big data platform.

What is Nice to Have
- Knowledge of Azure Purview is a must.
- Knowledge of Azure Data Fabric.
- Ability to define reference data architecture.
- Snowflake: certified in SnowPro Advanced Certification.
- Cloud-native data platform experience in the AWS or Microsoft stack.
- Knowledge of the latest data trends, including data fabric and data mesh.
- Robust knowledge of ETL, data transformation, and data standardization approaches.
- Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions.
- Lead the selection, deployment, and management of data tools, platforms, and infrastructure.
- Ability to technically guide a team of data engineers.
- Oversee the design, development, and deployment of data solutions.
- Define, differentiate, and strategize new data services/offerings and create reference architecture assets.
- Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
- Guide and inspire the organization about the business potential and opportunities around data.
- Network with domain experts.
- Collaborate with client teams to understand their business challenges and needs.
- Develop and propose data solutions tailored to client-specific requirements.
- Influence client revenues through innovative solutions and thought leadership.
- Lead client engagements from project initiation to deployment.
- Build and maintain strong relationships with key clients and stakeholders.
- Build reusable methodologies, pipelines, and models.

What Makes You Eligible
- Build and manage a high-performing team of data engineers and other specialists.
- Foster a culture of innovation and collaboration within the data team and across the organization.
- Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment.
- Candidates should be confident, energetic self-starters with strong communication skills.
- Candidates should exhibit superior presentation skills and the ability to present compelling solutions that guide and inspire.
- Provide technical guidance and mentorship to the data team.
- Collaborate with other stakeholders across the company to align the vision and goals.
- Communicate and present the data capabilities and achievements to clients and partners.
- Stay updated on the latest trends and developments in the data domain.

What We Offer
- Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.).
- Professional development opportunities through HARMAN University's business and leadership academies.
- An inclusive and diverse work environment that fosters and encourages professional and personal development.
- "Be Brilliant" employee recognition and rewards program.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you - all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information, or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us at harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Posted 1 week ago
3.0 years
0 Lacs
India
Remote
Title: Azure Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory.
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable.
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack
Azure | Databricks | PySpark | SQL

What We're Looking For
- 3+ years of experience in data engineering or analytics engineering.
- Hands-on with cloud data platforms and large-scale data processing.
- Strong problem-solving mindset and a passion for clean, efficient data design.

Job Description:
- Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or on the basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Project Manager - Data Engineering
Location: Pune (with travel to the Middle East as required)
Experience: 7-10 Years
Employment Type: Full-time

About the Role
We are looking for an experienced and hands-on Project Manager in Data Engineering who can lead the end-to-end delivery of data pipeline projects across Azure and AWS environments. The ideal candidate will bring strong technical depth in data engineering along with client-facing and project execution capabilities.

Key Responsibilities
· Lead and manage multiple data engineering projects across Azure and AWS ecosystems.
· Gather client requirements and translate them into technical specifications and delivery roadmaps.
· Design, oversee, and ensure successful implementation of scalable data pipelines, ETL processes, and data integration workflows.
· Collaborate with internal data engineers, BI developers, and client stakeholders to ensure smooth project execution.
· Ensure adherence to timelines, quality standards, and cost constraints.
· Identify project risks and dependencies, and proactively resolve issues.
· Own the client relationship from initiation to delivery - conduct regular check-ins, demos, and retrospectives.
· Stay updated on emerging tools and best practices in the data engineering space and recommend their adoption.
· Lead sprint planning, resource allocation, and tracking using Agile or hybrid methodologies.

Required Skills & Experience
· 7-10 years of total experience in data engineering and project delivery.
· Strong experience in Azure Data Services - Azure Data Factory, Synapse, Databricks, Data Lake, etc.
· Working knowledge of AWS data tools such as Glue, Redshift, S3, and Lambda functions.
· Good understanding of data modeling, data warehousing, and pipeline orchestration.
· Experience with tools such as Talend, Airflow, DBT, or other orchestration platforms is a plus.
· Proven track record of managing enterprise data projects from requirement gathering to deployment.
· Client-facing experience with strong communication and stakeholder management skills.
· Strong understanding of project management methodologies and tools (e.g., JIRA, Trello, MS Project).

Preferred Qualifications
· Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
· PMP or PRINCE2 certification is a plus.
· Experience working with Middle East clients is an added advantage.
· Exposure to modern data platforms, real-time data processing, or big data tools is a plus.

Additional Details
· This is a Pune-based role with expected travel to Middle East locations based on project needs.
· Should be open to handling cross-functional teams and multiple projects simultaneously.
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary
We are seeking an experienced Database Developer with strong expertise in Relational Database Management Systems (RDBMS), particularly Oracle, including writing complex stored procedures, triggers, and functions. You will work closely with cross-functional teams to design, develop, optimize, and maintain scalable and efficient database solutions.

Key Responsibilities
- Design, develop, and implement database structures and solutions for high-performance data processing and reporting.
- Work with Oracle RDBMS to write and optimize complex SQL queries, stored procedures, triggers, and functions.
- Apply basic knowledge of Talend to ensure efficient data integration, transformation, and loading.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Design, implement, and maintain complex database structures, ensuring consistency, reliability, and high availability.
- Troubleshoot database issues, including performance, security, and availability, and take necessary corrective actions.
- Perform database tuning to optimize the performance of queries, indexes, and system resources.
- Maintain data integrity and support data security protocols in line with industry best practices.
- Develop and manage database migration strategies, ensuring smooth data transitions between systems.
- Document and standardize coding practices, procedures, and database workflows.
- Monitor database system performance and create reports for operational monitoring and optimization.
- Collaborate with software development teams to ensure that database solutions align with application architecture and system requirements.

Skills and Qualifications
- 6 years of hands-on experience working with RDBMS such as Oracle.
- Proficient in writing and optimizing SQL queries, stored procedures, triggers, and functions in Oracle.
- Strong experience in database design, including normalization, indexing, and partitioning for performance optimization.
- Experience with Oracle PL/SQL and database tuning to improve query performance.
- Familiarity with database replication, data migrations, and backup and recovery strategies.
- Understanding of data security protocols and compliance standards (e.g., GDPR, HIPAA).
- Ability to troubleshoot complex database issues related to performance, integrity, and security.
- Strong analytical and problem-solving skills, with the ability to handle complex data challenges.
- Excellent communication skills and the ability to work well with both technical and non-technical teams.
- Familiarity with database administration concepts and monitoring tools.
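The stored-procedure work described above is commonly driven from Python via the python-oracledb driver. A hedged sketch follows; the DSN, credentials, package, and procedure names are placeholders.

```python
# Illustrative call to an Oracle stored procedure with an OUT parameter.
import oracledb

conn = oracledb.connect(user="app_user", password="***", dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()

# OUT-parameter pattern: the (hypothetical) procedure reports how many rows
# it reconciled for the given period.
reconciled = cur.var(int)
cur.callproc("pkg_migration.reconcile_accounts", ["2025-06", reconciled])
print("rows reconciled:", reconciled.getvalue())

conn.commit()
conn.close()
```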
Must Have
- End-to-end web analytics implementation project activation.
- Defining the technical implementation and data layer architecture during tag implementation.
- Integrating other solutions like consent management (OneTrust), ObservePoint, and ETL tools (Alteryx) with the Google Analytics platform.
- Gathering technical requirements from the client and creating documentation such as SDRs, tech specs, and MRDs.
- Ability to plan and implement methods to measure experiences, including tag management solutions like Tealium iQ (primarily), Adobe Launch, Adobe Analytics, Dynamic Tag Manager, Ensighten, and Google Tag Manager.
- Understand and use a multitude of tag managers and write JavaScript code to realize client-driven business requirements.
- Responsible for site optimization, with the ability to design solutions and implement the analytics strategy and technology needed to gain and stitch together insights into both online and physical location activity.
- Experienced in marketing performance analysis, i.e., data aggregation (leveraging marketing and click-stream APIs, data cleaning and transformation), analysis and segmentation, targeting and integration.
- Experienced in A/B testing and MVT/optimization frameworks using tools like Adobe Target.
- Develop the strategy for enterprise-level solutions as well as architecting extensible and maintainable solutions utilizing the Adobe and Google analytics platforms.
- Excellent understanding of digital analytics, especially clickstream data.
- Ability to create data visualization dashboards, especially in Workspace, Data Studio, MS Excel, and Adobe Report Builder.
- Agile methodology understanding.

About you:
- Analytics platforms: Google Analytics, Adobe Analytics/Omniture SiteCatalyst, Matomo/Piwik.
- Tag managers: Adobe Launch/DTM, Tealium iQ, Google Tag Manager, Piwik Pro, Signal/BrightTag.
- Optimization platforms: Adobe Target, Google Optimize, Optimizely.
- 1+ years in a client-facing role for solutioning and/or evangelizing technology approaches.
- Programming languages: JavaScript, jQuery.
- Markup languages: HTML, CSS.

Good to have

EQUAL OPPORTUNITY
Indegene is proud to be an Equal Employment Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristics. All employment decisions, from hiring to separation, will be based on business requirements and the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristics.

Locations: Bangalore, KA, IN
Posted 1 week ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role: Senior Data Engineer
Location: Indore

Job Description:
- Build and maintain data pipelines for ingesting and processing structured and unstructured data.
- Ensure data accuracy and quality through validation checks and sanity reports.
- Improve data infrastructure by automating manual processes and scaling systems.
- Support internal teams (Product, Delivery, Onboarding) with data issues and solutions.
- Analyze data trends and provide insights to inform key business decisions.
- Collaborate with program managers to resolve data issues and maintain clear documentation.

Must-Have Skills:
- Proficiency in SQL, Python (Pandas, NumPy), and R.
- Experience with ETL tools (e.g., Apache NiFi, Talend, AWS Glue).
- Cloud experience with AWS (S3, Redshift, EMR, Athena, RDS).
- Strong understanding of data modeling, warehousing, and data validation.
- Familiarity with data visualization tools (Tableau, Power BI, Looker).
- Experience with Apache Airflow, Kubernetes, Terraform, Docker.
- Knowledge of data lake architectures, APIs, and custom data formats (JSON, XML, YAML).
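The validation checks and sanity reports responsibility maps naturally to a small pandas check pass. A minimal sketch with invented columns and rules:

```python
# Toy sanity report: run named data-quality checks over a frame and print
# PASS/FAIL per rule. Columns and thresholds are illustrative only.
import pandas as pd

df = pd.DataFrame({
    "user_id": [1, 2, 2, 4],          # contains a duplicate
    "score":   [0.9, 1.4, 0.7, None], # contains an out-of-range value and a null
})

checks = {
    "no duplicate user_id": df["user_id"].is_unique,
    "no null scores":       df["score"].notna().all(),
    "scores within [0, 1]": df["score"].dropna().between(0, 1).all(),
}

for name, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {name}")
```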
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- 4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics.
- Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web.
- Proven experience in building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts).
- Proficiency in Python programming for automation, data processing, and system integration (see the sketch after this listing).
- Experience with REST APIs, JSON/XML data formats, and data extraction from external sources.

Preferred Technical And Professional Experience
- Strong SQL knowledge and ability to work with relational databases.
- Familiarity with Agile methodologies and version control systems (e.g., Git).
- Excellent analytical, problem-solving, and communication skills.
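As a hedged illustration of the Python + REST API expectation above, here is a minimal sketch against the IBM Planning Analytics (TM1) REST API. The host, port, credentials, and authentication mode are assumptions; real deployments often use CAM/SSO rather than basic auth:

```python
import requests

# Hypothetical TM1/Planning Analytics server details.
BASE_URL = "https://tm1-server:8010/api/v1"
AUTH = ("admin", "password")  # assumption: basic auth enabled on this instance

def list_cubes() -> list[str]:
    """Fetch cube names from the TM1 REST API (OData-style endpoint)."""
    # verify=False only because dev instances commonly use self-signed certs.
    resp = requests.get(f"{BASE_URL}/Cubes?$select=Name", auth=AUTH, verify=False)
    resp.raise_for_status()
    return [cube["Name"] for cube in resp.json()["value"]]

if __name__ == "__main__":
    print(list_cubes())
```

The same pattern (authenticate, call an /api/v1 resource, parse the OData "value" array) extends to reading cube data or triggering TI processes.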
Posted 1 week ago
12.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
- Define and design the future-state data architecture for HR reporting, forecasting, and analysis products.
- Partner with Technology, Data Stewards, and various Product teams in an Agile workstream while meeting program goals and deadlines.
- Engage with line-of-business, operations, and project partners to gather process improvements.
- Lead the design and build of new models to efficiently deliver financial results to senior management.
- Evaluate data-related tools and technologies and recommend appropriate implementation patterns and standard methodologies to keep our data ecosystem modern.
- Collaborate with Enterprise Data Architects in establishing and adhering to enterprise standards, and perform POCs to ensure those standards are implemented.
- Provide technical expertise and mentorship to Data Engineers and Data Analysts in the data architecture.
- Develop and maintain processes, standards, policies, guidelines, and governance to ensure a consistent framework and set of standards is applied across the company.
- Create and maintain conceptual/logical data models to identify key business entities and visualize relationships.
- Work with business and IT teams to understand data requirements.
- Maintain a data dictionary consisting of table and column definitions (see the sketch after this listing).
- Review data models with both technical and business audiences.

What You'll Bring
Essential Education
- Minimum of a Bachelor's degree in Computer Science, Engineering, or a similar field
- Additional certification in Data Management or cloud data platforms like Snowflake preferred

Essential Experience & Job Requirements
- 12+ years of IT experience with a major focus on data warehouse/database-related projects
- Expertise in cloud databases like Snowflake, Redshift, etc.
- Expertise in data warehousing architecture, BI/analytical systems, data cataloguing, MDM, etc.
- Proficient in conceptual, logical, and physical data modelling
- Proficient in documenting all architecture-related work performed
- Proficient in data storage, ETL/ELT, and data analytics tools like AWS Glue, DBT/Talend, Fivetran, APIs, Tableau, Power BI, Alteryx, etc.
- Experience in building data solutions to support comp benchmarking, pay transparency / pay equity, and total rewards use cases preferred
- Experience with cloud big data technologies such as AWS, Azure, GCP, and Snowflake a plus
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs) a plus
- Excellent written, oral communication, and presentation skills to present architecture, features, and solution recommendations

Additional Info
You're good at:
- Designing, documenting, and training the team on the overall processes and process flows for the data architecture.
- Resolving technical challenges in critical situations that require immediate resolution.
- Developing relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Reviewing work from other tech team members and providing feedback for growth.
- Implementing data security policies that align with governance objectives and regulatory requirements.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
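As a hedged illustration of the data-dictionary responsibility above, here is a minimal Python sketch that pulls table and column definitions from Snowflake's INFORMATION_SCHEMA using the snowflake-connector-python package. The connection parameters, database, and schema are assumptions:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ANALYTICS_WH", database="HR_DB",
)

# Pull table/column metadata to seed a data dictionary.
query = """
    SELECT table_name, column_name, data_type, comment
    FROM information_schema.columns
    WHERE table_schema = 'PUBLIC'
    ORDER BY table_name, ordinal_position
"""
with conn.cursor() as cur:
    for table, column, dtype, comment in cur.execute(query):
        print(f"{table}.{column}: {dtype} -- {comment or 'no description yet'}")
conn.close()
```

Seeding the dictionary from live metadata like this keeps definitions in sync with the warehouse; the business descriptions are then maintained on top of it.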
Posted 1 week ago
2.0 years
8 - 9 Lacs
Gurgaon
Remote
We are seeking a skilled and motivated Data Engineer with exceptional SQL expertise and Talend experience to join our dynamic team. As a Data Engineer, you will play a critical role in designing, developing, and maintaining data integration solutions that drive business intelligence and decision-making.

Qualifications:
- 2+ years of experience as a Data Engineer
- Hands-on experience with Talend and advanced SQL
- Good experience with Python
- Experience with relational databases (e.g., MySQL, SQL Server, Oracle) and data warehousing concepts
- Experience with Informatica, Pentaho Data Integration, AWS Glue, IBM DataStage, Snowflake, or GCP is good to have
- Strong problem-solving skills with the ability to analyze and interpret complex data
- Extensive hands-on experience implementing data migration and data processing (see the sketch after this listing)
- Good exposure to data warehousing and data mining

Why join us? You'll have the opportunity to collaborate on multiple global projects, essentially gaining experience across multiple technologies simultaneously.

More reasons to join us:
- 4.2 Glassdoor rating
- Fully remote work environment
- Exposure to cutting-edge technologies and international clients spanning various industries
- Opportunities to engage in diverse projects and technologies, with cross-domain training and support for career or domain transitions, including certification reimbursement
- Profitable and bootstrapped company
- Flexible working hours with a 5-day workweek
- Over 30 paid leaves annually
- Merit-based compensation with above-average annual increments
- Sponsored team luncheons, festive celebrations, and semi-annual retreats

Candidate Source: Referral
Experience Level: 3-5 Years
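To make the data-migration qualification concrete, here is a minimal, hypothetical reconciliation sketch in Python using the standard-library sqlite3 module. In practice the source and target would be production databases behind a Talend pipeline; the table, column, and checksum choice here are assumptions:

```python
import sqlite3

def reconcile(source: sqlite3.Connection, target: sqlite3.Connection, table: str) -> bool:
    """Compare row counts and a simple checksum between the source and
    target copies of a migrated table."""
    def profile(conn: sqlite3.Connection) -> tuple:
        count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        checksum = conn.execute(f"SELECT TOTAL(amount) FROM {table}").fetchone()[0]
        return count, checksum

    src, tgt = profile(source), profile(target)
    print(f"{table}: source={src} target={tgt}")
    return src == tgt

# Hypothetical usage: in-memory databases stand in for the legacy and new systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE payments (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", [(1, 10.5), (2, 20.0)])
print(reconcile(src, tgt, "payments"))  # True when the migration is consistent
```

Row counts plus a column checksum catch most gross migration defects cheaply; a full row-by-row comparison is reserved for tables where the cheap checks disagree.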
Posted 1 week ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Title: Talend Engineer
Location: Mumbai - Hybrid (3 days in office)
Job Type: Permanent
Years of Experience: 5+

Requirements:
1. Minimum 3 years of experience in Talend development.
2. Expertise in creating Talend ETL jobs utilizing File and Database components.
3. Practical experience using Talend's API components to handle GET and POST requests for data communication and integration (see the sketch after this listing).
4. Experience with, or a good understanding of, AWS components and how to utilize them in Talend ETL processes.
5. Experience migrating and deploying ETL pipelines across different environments.
6. Proficiency in SQL and experience working with relational databases to ensure efficient data manipulation and retrieval.
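Talend typically covers the GET/POST pattern in requirement 3 with API components such as tRESTClient. As a hedged illustration, here is the equivalent flow in plain Python with the requests library; the endpoint URLs and payload are assumptions:

```python
import requests

API_BASE = "https://api.example.com"  # hypothetical REST service

# GET: fetch existing records from an upstream service.
resp = requests.get(f"{API_BASE}/customers", params={"status": "active"}, timeout=30)
resp.raise_for_status()
customers = resp.json()

# POST: push a transformed record to a downstream service.
payload = {"customer_id": 42, "segment": "premium"}
resp = requests.post(f"{API_BASE}/segments", json=payload, timeout=30)
resp.raise_for_status()
print(f"Fetched {len(customers)} customers; POST returned {resp.status_code}")
```

Inside a Talend job the same request/response handling is configured on the component rather than coded, but the integration logic is identical.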
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, is listed on the NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. Major delivery centers in India include Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

- Job Title: Talend DI Developer
- Location: Pune, Hyderabad (Hybrid)
- Experience: 7+ years
- Job Type: Contract to hire
- Notice Period: Immediate joiners

Mandatory Skills: Talend DI (mandatory) and good exposure to RDBMS databases like Oracle and SQL Server.
- 3+ years of experience implementing ETL projects in a large-scale enterprise data warehouse environment; at least one successful Talend implementation with a DWH is a must.
- As a Senior Developer, the candidate is responsible for development, support, maintenance, and implementation of a complex project module, and is expected to have deep knowledge of the specified technology area, including applicable processes, methodologies, standards, products, and frameworks.
- She/he should have experience applying standard software development principles using Talend.
- She/he should be able to work as an independent team member, capable of applying judgment to plan and execute HWB tasks.
- Build reusable Talend jobs, routines, and components to support data integration, quality, and transformations (see the sketch after this listing).
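Talend promotes reuse through context variables and shared routines. As a loose illustration of the same idea, here is a hypothetical Python sketch of one parameterized load function reused across environments; the names, paths, and quality rule are assumptions:

```python
import csv
from dataclasses import dataclass

@dataclass
class Context:
    """Mirrors a Talend context group: one reusable job, many environments."""
    env: str
    input_path: str
    reject_path: str

def load_customers(ctx: Context) -> int:
    """Reusable load: read a delimited file, count valid rows, divert rejects."""
    loaded = 0
    with open(ctx.input_path, newline="") as src, \
         open(ctx.reject_path, "w", newline="") as rej:
        rejects = csv.writer(rej)
        for row in csv.reader(src):
            if len(row) == 2 and row[0].isdigit():  # simple quality rule
                loaded += 1
            else:
                rejects.writerow(row)
    print(f"[{ctx.env}] loaded {loaded} rows")
    return loaded

# Hypothetical usage: the same "job" runs in DEV and PROD with different contexts.
# load_customers(Context("DEV", "dev/customers.csv", "dev/rejects.csv"))
# load_customers(Context("PROD", "prod/customers.csv", "prod/rejects.csv"))
```

Keeping environment specifics in a context object (or Talend context group) is what lets one job definition be promoted across environments unchanged.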
Posted 1 week ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
The top hiring cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.
The average salary range for Talend professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!