4.0 - 6.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experience: 4-6 years. Location: Chennai / Pune / Mumbai / Bangalore.
- Strong in-depth knowledge of databases and database concepts to support the design and development of data warehouse / data lake applications.
- Analyze business requirements and work with the business and the data modeler to support data flow from source to target destination.
- Follow release and change processes: distribution of software builds and releases to development and test environments and to production.
- Adhere to the project's SDLC process (Agile); participate in team discussions and scrum calls, and work collaboratively with internal and external team members.
- Develop ETL using Ab Initio; databases: MS SQL Server, big data, and text/Excel files.
- Engage in the intake/release/change/incident/problem management processes.
- Able to prioritize and drive all relevant support priorities (incident, change, problem, knowledge, engagement with projects).
- Develop and document a high-level conceptual data process design for review by the architect and data analysts, to serve as a basis for writing ETL code and designing test plans.
- Thoroughly unit test ETL code to ensure error-free, efficient delivery.
- Analyze several aspects of code prior to release to ensure that it will run efficiently and can be supported in the production environment.
- Should be able to provide data modeling solutions.
Kindly share your updated resume to AISHWARYAG5@hexaware.com
Posted 6 days ago
7.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
7+ years of experience in ETL testing, Snowflake, and DWH concepts. Strong SQL knowledge and debugging skills are a must. Experience with Azure and Snowflake testing is a plus. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle. Experience with JIRA and Xray defect management tools is good to have. Exposure to financial domain knowledge is considered a plus. Testing data readiness (data quality) and addressing code or data issues. Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions. Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as PowerPoint, Excel, and SQL. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
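For orientation only, here is a minimal sketch of the kind of source-to-target reconciliation such ETL-testing roles describe, assuming hypothetical staging and warehouse tables and the snowflake-connector-python client; it is not part of the posting itself.

```python
# Minimal ETL-testing sketch: reconcile row counts and a numeric checksum
# between a hypothetical staging table and its warehouse target in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_tester", password="***",
    warehouse="TEST_WH", database="EDW",
)
cur = conn.cursor()

def scalar(sql):
    # Run a single-value query and return that value.
    cur.execute(sql)
    return cur.fetchone()[0]

src_count = scalar("SELECT COUNT(*) FROM STG.ORDERS")
tgt_count = scalar("SELECT COUNT(*) FROM DWH.FACT_ORDERS")
src_sum = scalar("SELECT SUM(ORDER_AMT) FROM STG.ORDERS")
tgt_sum = scalar("SELECT SUM(ORDER_AMT) FROM DWH.FACT_ORDERS")

assert src_count == tgt_count, f"row counts differ: {src_count} vs {tgt_count}"
assert src_sum == tgt_sum, f"checksums differ: {src_sum} vs {tgt_sum}"
print("reconciliation passed")
cur.close()
conn.close()
```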
Posted 1 week ago
6.0 - 11.0 years
20 - 27 Lacs
Pune
Work from Office
Mandatory Primary Skills: Python, PySpark & SQL. Secondary Skills: Any cloud experience, DWH, BI tools (Qlik, Power BI, etc.)
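As a small illustration of the primary skills named above (not part of the posting), a minimal PySpark-plus-SQL job with hypothetical paths and columns:

```python
# Minimal PySpark + SQL sketch: load a CSV, register it as a view,
# aggregate with Spark SQL, and write Parquet. Paths/columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dwh-aggregation").getOrCreate()

orders = spark.read.csv("/data/in/orders.csv", header=True, inferSchema=True)
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_cnt, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")
daily.write.mode("overwrite").parquet("/data/out/daily_orders")
```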
Posted 1 week ago
8.0 - 10.0 years
10 - 14 Lacs
Hyderabad
Work from Office
What you will do: Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.
Roles & Responsibilities:
- Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives.
- Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities.
- Establish and enforce architectural standards, policies, and governance frameworks.
- Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture.
- Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency.
- Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Ensure seamless integration between systems and platforms, both within the organization and with external partners.
- Design systems that can scale to meet growing business needs and performance demands.
- Develop and maintain logical, physical, and conceptual data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Contribute to a program vision while advising on and articulating program/project strategies for enabling technologies.
- Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture and design, environment management, testing, and platform education.
- Drive the creation of application and technical design standards that leverage best practices and effectively integrate Salesforce into Amgen's infrastructure.
- Troubleshoot key product-team implementation issues and demonstrate the ability to drive them to successful resolution.
- Lead the evaluation of business and technical requirements at a senior level.
- Review releases and roadmaps from Salesforce and evaluate their impact on current applications, orgs, and solutions.
- Identify and proactively manage risk areas, with a commitment to seeing an issue through to complete resolution.
- Negotiate solutions to complex problems with both the product teams and third-party service providers.
- Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of a single or current project.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree with 8-10 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree with 10-14 years of experience in Computer Science, IT, or a related field; OR Diploma with 14-18 years of experience in Computer Science, IT, or a related field.
- Experience with SFDC Service Cloud / Health Cloud in a call center environment.
- Strong architectural design and modeling skills.
- Extensive knowledge of enterprise architecture frameworks and methodologies.
- Experience with system integration and IT infrastructure.
- Experience directing solution design, business process redesign, and aligning business requirements to technical solutions in a regulated environment.
- Experience working in agile methodology, including Product Teams and Product Development models.
- Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources.
- Demonstrable experience and ability to develop custom configured, Visualforce, and Lightning applications on the platform.
- Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud.
- Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases.
- A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns for enterprise systems (e.g., CMS, ERP, HRIS, DWH/DM).
- Demonstrably excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior and the desire and ability to self-start, learn, and apply new technologies.
Preferred Qualifications:
- Strong solution design and problem-solving skills.
- Solid understanding of technology, function, or platform.
- Experience in developing differentiated and deliverable solutions.
- Ability to analyze client requirements and translate them into solutions.
Professional Certifications: Salesforce Admin; Advanced Admin; Platform Builder; Salesforce Application Architect (mandatory).
Skills:
- Excellent critical-thinking and problem-solving skills.
- Good communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated awareness of presentation skills.
Posted 1 week ago
4.0 - 8.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
Primary skills: MS SQL Server, T-SQL, performance tuning, SSIS, and DWH. Experience: 6 years. Location: Pune / Mumbai / Chennai / Bangalore. Notice period: immediate joiners. Proven hands-on experience as a Microsoft BI Developer (SSIS). Expert in SQL; should be able to write complex, nested queries and stored procedures. Background in data warehouse design (e.g., dimensional modelling) and data mining. Added advantage: knowledge of Master Data Services, Power BI, and Tableau. Basic understanding of the Agile process is good to have. Extensive experience with SQL Server databases. Good knowledge of stored procedures, views, complex queries, and functions.
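For context, a hedged sketch of calling a parameterized SQL Server stored procedure from Python via pyodbc; the server, database, procedure, and parameter names are hypothetical, not from the listing:

```python
# Illustrative only: execute a parameterized T-SQL stored procedure on
# SQL Server with pyodbc. All names below are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dwh-sql01;DATABASE=EDW;Trusted_Connection=yes;"
)
cur = conn.cursor()
# pyodbc binds ? placeholders to the supplied parameters.
cur.execute("EXEC dbo.usp_LoadDailySales @LoadDate = ?", "2024-01-31")
conn.commit()
cur.close()
conn.close()
```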
Posted 1 week ago
4.0 - 9.0 years
5 - 10 Lacs
Chennai, Bengaluru
Work from Office
Job Purpose: We are seeking an experienced Azure Data Engineer with 4 to 13 years of proven expertise in data lakes, lakehouse architecture, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehousing, with working experience in ETL, data catalogs, metadata, DWH, MPP systems, and OLTP and OLAP systems, along with strong communication skills. The ideal candidate should have:
Key Responsibilities:
- Create data lakes from scratch, configure existing systems, and provide user support.
- Understand different datasets and storage elements to bring in data.
- Good knowledge of and work experience with ADF and Synapse data pipelines.
- Good knowledge of Python, PySpark, and Spark SQL.
- Implement data security at the DB and data-movement layers.
- Experience with CI/CD data pipelines.
- Work with internal teams to design, develop, and maintain software.
Qualifications & Key skills required:
- Expertise in data lakes, lakehouse, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehousing.
- Hands-on experience in ETL and ELT, handling large volumes of data and files.
- Working knowledge of JSON, Parquet, CSV, Excel, structured, unstructured, and other data sets.
- Exposure to a source control management system such as TFS/Git/SVN.
- Understanding of non-functional requirements.
- Proficient in data catalogs, metadata, DWH, MPP systems, and OLTP and OLAP systems.
- Experience with Azure Data Fabric, MS Purview, and MDM tools is an added advantage.
- A good team player and excellent communicator.
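A rough sketch of the lake-ingestion step such a role involves, assuming a Synapse/Databricks Spark session and hypothetical ADLS Gen2 container paths (authentication setup omitted):

```python
# Land semi-structured JSON as partitioned Parquet in ADLS Gen2.
# Storage account, containers, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

raw = spark.read.json("abfss://raw@mydatalake.dfs.core.windows.net/events/2024/")
clean = raw.dropDuplicates(["event_id"]).filter("event_type IS NOT NULL")
clean.write.mode("append").partitionBy("event_type").parquet(
    "abfss://curated@mydatalake.dfs.core.windows.net/events/"
)
```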
Posted 1 week ago
6.0 - 11.0 years
15 - 20 Lacs
Hyderabad, Pune, Chennai
Work from Office
Hiring for a top IT company. Designation: ETL Tester. Skills: ETL Testing + Data Warehouse + Snowflake + Azure. Location: Bangalore/Hyderabad/Pune/Chennai. Exp: 5-10 yrs. Call: Nisha: 8875876654, Afreen: 9610352987, Garima: 8875813216, Kajal: 8875831472. Team Converse
Posted 1 week ago
8.0 - 13.0 years
22 - 37 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Senior Professional Scrum Master. Experience: 8 to 12 years of IT experience. Location: Pune, Nagpur, Hyderabad, Bangalore, Mumbai. - Certified Scrum Master - Excellent communication skills - Well versed with Agile methodology to plan, manage, and deliver solutions - Conducts all scrum ceremonies and aids in story/task creation and estimation - Identifies and manages issues, risks, and action items - Schedules and facilitates all scrum events and decision-making processes - Monitors progress and helps teams make improvements - Strong technical background, preferably in the data warehouse and healthcare domains.
Posted 1 week ago
7.0 - 12.0 years
10 - 20 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Hexaware Technologies is hiring for Senior Data Engineer. Primary skills: ETL, SSIS, SQL, DWH. Notice period: immediate/early joiners preferred. Location: Chennai, Mumbai, Pune, Bangalore. Total experience required: 6 to 12 years. If interested, kindly share your updated resume with the details below.
Full name: / Total IT exp: / Relevant exp in (ETL, SSIS, DWH): / Current location: / Current CTC: / Expected CTC: / Notice period (mention LWD if serving):
Job Description:
- 5+ years of experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance.
- 5+ years of experience writing and optimizing SQL queries and stored procedures for data processing and data analysis.
- 5+ years of experience designing and building complete data pipelines, moving and transforming data for ODS, staging, data warehousing, and data marts using SQL Server Integration Services (ETL) or other related technologies.
- 5+ years of experience implementing data warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies.
- 5+ years of experience with large-scale data processing and query optimization techniques using T-SQL.
- 5+ years of experience implementing audit, balance, and control mechanisms in data solutions.
- 3+ years of experience with source control repositories such as Git, TFVC, or Azure DevOps, including branching and merging, and implementing CI/CD pipelines for database and ETL workloads.
- 2+ years of experience working with Python pandas to process semi-structured data sets and load them to a SQL Server DB.
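As an illustration of the last requirement above (pandas loading semi-structured data into SQL Server), a minimal sketch with hypothetical file, table, and connection names; assumes SQLAlchemy with an ODBC driver installed:

```python
# Flatten nested JSON records with pandas and bulk-load them into a
# SQL Server staging table. All names and the URL are hypothetical.
import json
import pandas as pd
from sqlalchemy import create_engine

with open("customers.json") as f:
    records = json.load(f)  # assumed: a list of nested dicts

# json_normalize flattens nested fields into underscore-joined columns.
df = pd.json_normalize(records, sep="_")

engine = create_engine(
    "mssql+pyodbc://dwh-sql01/EDW?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("stg_customers", engine, schema="stg",
          if_exists="append", index=False, chunksize=1000)
```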
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.
Job Description:
Exp: 5-10 yrs
Location: Gurugram/Bangalore/Pune
Skill: Azure Data Engineer
If interested, please share your resume to sangeetha.spstaffing@gmail.com with the following inline details:
Full name as per PAN: / Mobile no: / Alt no/WhatsApp no: / Total exp: / Relevant exp in Databricks: / Relevant exp in PySpark: / Relevant exp in DWH: / Relevant exp in Python: / Current CTC: / Expected CTC: / Notice period (official): / Notice period (negotiable)/reason: / Date of birth: / PAN number: / Reason for job change: / Offer in pipeline (current status): / Availability for virtual interview on weekdays between 10 AM-4 PM (please mention time): / Current residential location: / Preferred job location: / Is your educational % in 10th std, 12th std, and UG all above 50%? / Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 2 weeks ago
8.0 - 13.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
Consultant Data Engineer
Tools & Technology: Snowflake, SnowSQL, AWS, DBT, Snowpark, Airflow, DWH, Unix, SQL, shell scripting, PySpark, Git, Visual Studio, ServiceNow.
Duties and Responsibilities:
- Act as Consultant Data Engineer.
- Understand business requirements and design, develop, and maintain scalable automated data pipelines and ETL processes to ensure efficient data processing and storage.
- Create a robust, extensible architecture to meet client/business requirements, building Snowflake objects integrated with AWS services and DBT.
- Work on different types of data ingestion pipelines as per requirements.
- Develop in DBT (Data Build Tool) for data transformation as per requirements.
- Work on integrating multiple AWS services with Snowflake.
- Work with integration of structured and semi-structured data sets.
- Work on performance tuning and cost optimization.
- Work on implementing CDC or SCD Type 2.
- Design and build solutions for near-real-time stream as well as batch processing.
- Implement best practices for data management, data quality, and data governance.
- Responsible for data collection, data cleaning, and pre-processing using Snowflake and DBT.
- Investigate production issues and fine-tune data pipelines.
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery.
- Coordinate with and support software developers, database architects, data analysts, and data scientists on data initiatives.
- Orchestrate pipelines using Airflow.
- Suggest improvements to processes, products, and services.
- Interact with users, management, and technical personnel to clarify business issues, identify problems, and suggest changes/solutions to business and developers.
- Create technical documentation on Confluence to aid knowledge sharing.

Associate Data Engineer
Tools & Technology: Snowflake, DBT, AWS, Airflow, ETL, data warehouse, shell scripting, SQL, Git, Confluence, Python.
Duties and Responsibilities:
- Act as offshore data engineer for enhancement and testing.
- Design and build solutions for near-real-time stream processing as well as batch processing.
- Develop Snowflake objects, implementing their distinctive features.
- Implement data integration and transformation workflows using DBT.
- Integrate AWS services with Snowflake.
- Participate in implementation planning and respond to production issues.
- Responsible for data collection, data cleaning, and pre-processing.
- Experience in developing UDFs, Snowflake procedures, streams, and tasks.
- Troubleshoot customer data issues: manual loads for any missed data, data-duplication checks, and handling with RCA.
- Investigate production job failures through to root cause (RCA).
- Develop ETL processes and data integration solutions.
- Understand the business needs of the client and provide technical solutions.
- Monitor the overall functioning of processes, identify improvement areas, and implement them with the help of scripting.
- Handle major outages effectively, with clear communication to business, users, and development partners.
- Define and create run book entries and knowledge articles based on incidents experienced in production.

Associate Engineer
Tools & Technology: UNIX, Oracle, shell scripting, ETL, Hadoop, Spark, Sqoop, Hive, Control-M, Techtia, SQL, Jira, HDFS, Snowflake, DBT, AWS.
Duties and Responsibilities:
- Worked as a Senior Production/Application Support Engineer.
- Production support for loading, processing, and reporting of files and for generating reports.
- Monitor multiple batches, jobs, and processes; analyze issues related to job failures; handle FTP failures and connectivity issues behind batch/job failures.
- Perform data analysis on files, generate files, and send files to destination servers depending on job functionality.
- Create shell scripts to automate daily tasks or service-owner requests.
- Tune jobs to improve performance and perform daily checks.
- Coordinate with middleware, DWH, CRM, and other teams in case of any issue or CRQ.
- Monitor the overall functioning of processes, identify improvement areas, and implement them with the help of scripting.
- Tune jobs to improve performance and raise PBIs after approval from the service owner.
- Involved in performance-improvement and automation activities to decrease manual workload.
- Data ingestion from RDBMS systems to HDFS/Hive through Sqoop.
- Understand customer problems and provide appropriate technical solutions.
- Handle major outages effectively, with clear communication to business, users, and development partners.
- Coordinate with the client and on-site persons and join bridge calls for any issues.
- Handle daily issues based on application and job performance.
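To illustrate the SCD Type 2 pattern this role mentions, a hedged two-statement Snowflake sketch (expire changed current rows, then insert new versions); table and column names are hypothetical, and tools like DBT snapshots or Matillion typically generate equivalent SQL:

```python
# Sketch of an SCD Type 2 load in Snowflake: close out changed rows,
# then insert new current versions. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl",
                                   password="***", database="EDW")
cur = conn.cursor()

# Step 1: expire current dimension rows whose tracked attributes changed.
cur.execute("""
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.address <> s.address OR d.segment <> s.segment) THEN
      UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
""")
# Step 2: insert a fresh current row for every key without one.
cur.execute("""
    INSERT INTO dim_customer
      (customer_id, address, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.address, s.segment,
           CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
""")
conn.commit()
```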
Posted 2 weeks ago
10.0 - 16.0 years
0 - 1 Lacs
Noida
Work from Office
Job Title / Designation: Database Manager. Location: Noida.
Job Summary: The role involves designing and developing solutions to support business needs. Optimizing and tuning existing programs and developing new routines will be an integral part of the profile.
Key Responsibility Areas:
• Architect, design, and develop solutions to support business requirements.
• Analyze and manage a variety of database environments such as Oracle, Postgres, Cassandra, MySQL, Graph DB, etc.
• Provide optimal design of database environments, analyse complex distributed production deployments, and make recommendations to optimize performance.
• Work closely with programming teams to deliver high-quality software.
• Provide innovative solutions to complex business and technology problems.
• Propose the best solutions in logical and physical data modelling.
• Perform administration tasks, including DB resource planning and DB tuning.
• Mentor and train junior developers; lead and manage teams.
Qualification: B.Tech. Experience: 10+ years of experience in a data engineering role; bachelor's degree in Computer Science or related experience.
Skill Sets / Requirements:
• Experience designing/architecting database solutions.
• Experience with multiple RDBMS and NoSQL databases at TB data sizes, preferably Oracle, PostgreSQL, Cassandra, and Graph DB.
• Must be well versed in PL/SQL and PostgreSQL, with strong query optimization skills.
• Expert knowledge of DB installation, configuration, replication, upgrades, security, and HA/DR setup.
• Experience in database deployment, performance, and/or troubleshooting issues.
• Knowledge of scripting languages (such as Unix shell, PHP).
• Advanced knowledge of PostgreSQL is preferred.
• Experience working with cloud platforms and services.
• Experience migrating database environments from one platform to another.
• Ability to work well under pressure.
• Experience with big data technologies and DWH is a plus.
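As a small illustration of the query-optimization skill this role calls for, a sketch that pulls a PostgreSQL execution plan via psycopg2; connection details and the query are hypothetical:

```python
# Illustrative tuning step: inspect a Postgres plan with EXPLAIN ANALYZE.
# Connection details and the query below are hypothetical.
import psycopg2

conn = psycopg2.connect(host="db01", dbname="appdb", user="dba", password="***")
cur = conn.cursor()
cur.execute("""
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT o.customer_id, SUM(o.amount)
    FROM orders o
    WHERE o.created_at >= NOW() - INTERVAL '30 days'
    GROUP BY o.customer_id
""")
for (line,) in cur.fetchall():
    print(line)  # look for seq scans on large tables and row misestimates
cur.close()
conn.close()
```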
Posted 2 weeks ago
10.0 - 16.0 years
25 - 27 Lacs
Chennai
Work from Office
We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai, with a shift from 2.00 pm to 11.00 pm IST. Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and star/snowflake schemas
- Experience in designing and building data pipelines on the Azure cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premise transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components
Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of the tools mandatory)
- BI: Power BI or Tableau (one of the tools mandatory)
Posted 3 weeks ago
8.0 - 10.0 years
0 - 1 Lacs
Noida
Work from Office
Strong understanding of customer data models, behavioural analytics, segmentation, and machine learning models. Experience with API integration, real-time event processing, and data pipelines. Prior experience working with ETL and DWH is a must. Prior experience in designing and implementing with cloud environments (GCP) and data platforms (e.g., Snowflake, BigQuery). Experience developing customer-facing user interfaces with BI tools like Google Looker, Power BI, or other open-source tools. Agile delivery experience; self-motivated and creative; good communication and interpersonal skills. A motivated self-starter, able to change direction quickly when priorities shift and to think through problems quickly to design and deliver solutions. Segment CDP platform developer experience preferred. Minimum relevant experience required is 8-10 years, with a B.Tech/MCA/M.Tech.
Posted 3 weeks ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud. 5+ years of experience in DWH, with 2-4 years of experience implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead the end-to-end system and architecture design for our application and infrastructure. Data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create an architecture in alignment with business needs. Collaborate in developing project requirements for the end-to-end data integration process using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience in SnowSQL and Snowpipe is an added advantage.
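For context, a minimal sketch of the Snowflake ingestion statement underlying Snowpipe- and Matillion-style loads, a COPY INTO from an external stage; the stage, table, and format are hypothetical:

```python
# Bulk-load staged JSON files into a Snowflake table with COPY INTO.
# The stage and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl",
                                   password="***", database="EDW", schema="STG")
cur = conn.cursor()
cur.execute("""
    COPY INTO stg_events
    FROM @s3_raw_stage/events/
    FILE_FORMAT = (TYPE = 'JSON')
    ON_ERROR = 'SKIP_FILE'
""")
print(cur.fetchall())  # COPY returns per-file load status rows
conn.commit()
```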
Posted 3 weeks ago
5 - 10 years
25 - 32 Lacs
Mohali, Hyderabad
Hybrid
Design and implement Azure BI infrastructure and ensure the overall quality of the delivered solution. Develop analytical and reporting tools; promote and drive adoption of the developed BI solutions. Demonstrated experience using development tools such as Azure SQL Database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps. Proficient in T-SQL and Python for data analysis. Experience with development methodologies including Agile, DevOps, and CI/CD patterns.
Posted 2 months ago
8 - 13 years
22 - 35 Lacs
Bengaluru
Work from Office
NatWest Group is hiring Software Engineers. Job Spec: Experienced in Python, PySpark, SQL, DWH, and AWS. Experience with AWS architecture, using EMR, EC2, S3, Lambda, and Glue. Experience using tools such as Apache Airflow, Anaconda, and SageMaker. Work in an agile environment with participation in daily stand-ups/scrums. Ability to produce coherent documentation to support development processes. Strong communication skills with stakeholders, developers, and users.
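As a brief illustration of the Airflow skill in the spec (assuming Airflow 2.x and a hypothetical DAG id and job path, not anything from the posting), a minimal daily DAG that shells out to spark-submit:

```python
# Minimal Airflow 2.x sketch: one daily task that submits a PySpark job.
# The DAG id, schedule, and script path are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dwh_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    BashOperator(
        task_id="spark_submit_etl",
        # {{ ds }} passes the logical run date to the job.
        bash_command="spark-submit /opt/jobs/load_dwh.py {{ ds }}",
    )
```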
Posted 2 months ago
8 - 15 years
10 - 22 Lacs
Bengaluru
Work from Office
Company Name: NatWest Group. Experience: 8+ years. Location: Bangalore. Interview Mode: Virtual. Interview Rounds: 2-3 rounds. Notice Period: Any. Job Requirements: Experienced in Python, PySpark, SQL, DWH, and AWS. Experience with AWS architecture, using EMR, EC2, S3, Lambda, and Glue. Experience using tools such as Apache Airflow, Anaconda, and SageMaker. Work in an agile environment with participation in daily stand-ups/scrums. Ability to produce coherent documentation to support development processes. Strong communication skills with stakeholders, developers, and users.
Posted 2 months ago
3 - 5 years
40 - 45 Lacs
Bhubaneshwar, Kochi, Kolkata
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore. Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance. Required Skills: 3 years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms. Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
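For illustration, a hedged boto3 sketch of the Athena usage named above; the database, table, region, and results bucket are hypothetical:

```python
# Start an Athena query over lake data with boto3; results land in S3.
# Database, table, and output location below are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "datalake"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("Query started:", resp["QueryExecutionId"])
```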
Posted 2 months ago
7 - 12 years
20 - 35 Lacs
Hyderabad
Hybrid
Job Description: We are looking for a highly skilled Lead Data Engineer with 8-12 years of experience to lead a team in building and managing advanced data solutions. The ideal candidate should have extensive experience with SQL, Teradata, Ab-Initio, and Google Cloud Platform (GCP). Key Responsibilities: Lead the design, development, and optimization of large-scale data pipelines, ensuring they meet business and technical requirements. Architect and implement data solutions using SQL, Teradata, Ab-Initio, and GCP, ensuring scalability, reliability, and performance. Mentor and guide a team of data engineers in the development and execution of ETL processes and data integration solutions. Collaborate with cross-functional teams (e.g., data scientists, analysts, product managers) to define data strategies and deliver end-to-end data solutions. Take ownership of end-to-end data workflows, from data ingestion to transformation, storage, and accessibility. Lead performance tuning and optimization efforts for complex SQL queries and Teradata database systems. Design and implement data governance, quality, and security best practices to ensure data integrity and compliance. Manage the migration of legacy data systems to cloud-based solutions on Google Cloud Platform (GCP). Ensure continuous improvement and automation of data pipelines and workflows. Troubleshoot and resolve issues related to data quality, pipeline performance, and system integration. Stay up-to-date with industry trends and emerging technologies to drive innovation and improve data engineering practices within the team. Required Skills: 8-12 years of experience in data engineering or related roles. Strong expertise in SQL, Teradata, and Ab-Initio. In-depth experience with Google Cloud Platform (GCP), including tools like BigQuery, Cloud Storage, Dataflow, etc. Proven track record of leading teams and projects related to data engineering and ETL pipeline development. Experience with data warehousing and cloud-native storage solutions. Strong analytical and problem-solving skills. Experience in setting up and enforcing data governance, security, and compliance standards. Preferred Skills: Familiarity with additional cloud services (AWS, Azure). Experience with data modeling and metadata management. Knowledge of big data technologies like Hadoop, Spark, etc. Strong communication skills and the ability to collaborate effectively with both technical and non-technical teams.
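As a small, hedged example of the GCP/BigQuery side of this stack (project, dataset, and table names are hypothetical; assumes the google-cloud-bigquery client and application-default credentials):

```python
# Run an aggregation against a hypothetical BigQuery warehouse table.
from google.cloud import bigquery

client = bigquery.Client(project="my-telecom-project")

query = """
    SELECT region, COUNT(DISTINCT subscriber_id) AS subscribers
    FROM `my-telecom-project.dwh.usage_daily`
    WHERE usage_date = CURRENT_DATE()
    GROUP BY region
"""
for row in client.query(query).result():
    print(row.region, row.subscribers)
```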
Posted 2 months ago
15 - 17 years
17 - 19 Lacs
Pune
Work from Office
Position Overview: We are seeking a dynamic and experienced Enterprise Solution Architect to lead the design and implementation of innovative solutions that align with our organization's strategic objectives. The Enterprise Solution Architect will play a key role in defining the architecture vision, establishing technical standards, and driving the adoption of best practices across the enterprise. The ideal candidate will have a deep understanding of enterprise architecture principles, business processes, and technology trends, with a focus on delivering scalable, flexible, and secure solutions.
Responsibilities:
• Drive client conversations and solutions and build strong relationships with the client, acting as a trusted advisor and technical expert.
• Experienced in laying down the architectural roadmap, guidelines, and high-level design covering the end-to-end lifecycle of the data value chain, from ingestion, integration, and consumption (visualization, AI capabilities) to data governance and non-functionals (including data security).
• Experienced in delivering large-scale data platform implementations for telecom clients; must have telecom domain understanding.
• Experienced in implementing data applications and platforms on GCP.
• Execute a comprehensive data migration strategy for our telecom client, involving multiple source systems moving to GCP.
• Deep-dive into client requirements to understand their data needs and challenges; proactively propose solutions that leverage GCP's capabilities or integrate with external tools for optimal results.
• Spearhead solution calls with the client, translating complex data architecture and engineering concepts into clear, actionable plans for data engineers; demonstrate flexibility and adaptability to accommodate evolving needs.
• Develop a robust data model for the telecom client, ensuring data is organized, consistent, and readily available for analysis.
• Leverage your expertise in Data, AI, and ML to create a future-proof blueprint for the client's data landscape, enabling advanced analytics and insights generation.
• Develop architectural principles, standards, and guidelines to ensure consistency, interoperability, and scalability across systems and applications.
• Lead the design and implementation of end-to-end solutions that leverage emerging technologies and industry best practices to address business challenges and opportunities.
• Conduct architectural reviews and assessments to validate design decisions, identify risks, and recommend mitigation strategies.
• Collaborate with vendors, partners, and external consultants to evaluate and select technology solutions that meet business requirements and align with enterprise architecture standards.
• Drive the adoption of cloud computing, microservices architecture, API management, and other emerging technologies to enable digital transformation and innovation.
• Communicate the enterprise architecture vision, principles, and roadmap to stakeholders at all levels of the organization, and advocate for architectural decisions and investments.
Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Total experience of 18+ years in data analytics implementations.
• Minimum of 10+ years of extensive experience as a Principal Solution Architect or in a similar senior role.
• Proven success in leading large-scale data migrations, particularly to GCP.
• In-depth knowledge of data architecture principles and best practices.
• Strong understanding of data modeling techniques and the ability to create efficient data models.
• Experience working with GCP and its various data management services (e.g., BigQuery, Cloud Storage, Dataflow, dbt).
• Experience with at least one programming language commonly used in data processing (e.g., Python, Java).
• A demonstrable understanding of Data Science, Artificial Intelligence, and Machine Learning concepts.
Posted 3 months ago
5 - 10 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Description: AWS Data Engineer, Hadoop Migration. We are seeking an experienced AWS Principal Data Architect to lead the migration of Hadoop DWH workloads from on-premise to AWS EMR. As an AWS Data Architect, you will be a recognized expert in cloud data engineering, developing solutions designed for the effective data processing and warehousing requirements of large enterprises. You will be responsible for designing, implementing, and optimizing the data architecture in AWS, ensuring highly scalable, flexible, secure, and resilient cloud architectures that solve business problems and help accelerate the adoption of our clients' data initiatives on the cloud.
Key Responsibilities:
- Lead the migration of Hadoop workloads from on-premise to the AWS EMR stack.
- Design and implement data architectures on AWS, including data pipelines, storage, and security.
- Collaborate with cross-functional teams to ensure seamless migration and integration.
- Optimize data architectures for scalability, performance, and cost-effectiveness.
- Develop and maintain technical documentation and standards.
- Provide technical leadership and mentorship to junior team members.
- Work closely with stakeholders to understand business requirements and ensure data architectures meet business needs.
- Work alongside customers to build enterprise data platforms using AWS data services such as Elastic MapReduce (EMR), Redshift, Kinesis, Data Exchange, DataSync, RDS, Data Store, Amazon MSK, DMS, Glue, AppFlow, AWS Zero-ETL, Glue Data Catalog, Athena, Lake Formation, S3, RMS, DataZone, Amazon MWAA, and APIs (Kong).
- Deep understanding of Hadoop components, conceptual processes, and system functioning, and of the corresponding components in AWS EMR and other AWS services.
- Good experience with Spark on EMR.
- Experience in Snowflake/Redshift.
- Good grasp of the AWS system engineering aspects of setting up CI/CD pipelines on AWS using CloudWatch, CloudTrail, KMS, IAM Identity Center, Secrets Manager, etc.
- Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide AWS solution architect community.
Basic Qualifications:
- 10+ years of IT experience, with 5+ years of experience in data engineering and 5+ years of hands-on experience in AWS data/EMR services (e.g., S3, Glue, Glue Catalog, Lake Formation).
- Strong understanding of Hadoop architecture, including HDFS, YARN, MapReduce, Hive, and HBase.
- Experience with data migration tools such as Glue and DataSync.
- Excellent knowledge of data modeling, data warehousing, ETL processes, and other data management systems.
- Strong understanding of security and compliance requirements in the cloud.
- Experience in Agile development methodologies and version control systems.
- Excellent communication and leadership skills; ability to work effectively across internal and external organizations and virtual teams.
- Deep experience with AWS-native data services, including Glue, Glue Catalog, EMR, Spark on EMR, DataSync, RDS, Data Exchange, Lake Formation, and Athena.
- AWS Certified Data Analytics - Specialty; AWS Certified Solutions Architect - Professional.
- Experience with containerization and serverless computing.
- Familiarity with DevOps practices and automation tools.
- Experience with Snowflake/Redshift implementation is additionally preferred.
Preferred Qualifications:
- Technical degree in computer science, software engineering, or mathematics.
- Cloud and data engineering background with migration experience.
Other Skills:
- A critical thinker with strong research, analytics, and problem-solving skills.
- Self-motivated with a positive attitude and an ability to work independently and/or in a team.
- Able to work under tight timelines and deliver on complex problems.
- Must be able to work flexible hours (including weekends and nights) as needed.
- A strong team player.
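To make the migration target concrete, a hedged boto3 sketch that submits a migrated Spark workload as an EMR step; the cluster id, script path, and region are hypothetical placeholders:

```python
# Submit a migrated Hive/Spark workload as a step on a running EMR cluster.
# Cluster id, job script, and region below are hypothetical.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # hypothetical cluster id
    Steps=[{
        "Name": "daily-usage-aggregation",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            # command-runner.jar lets EMR run spark-submit as a step.
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "--deploy-mode", "cluster",
                     "s3://my-bucket/jobs/daily_usage.py"],
        },
    }],
)
```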
Posted 3 months ago
2 - 7 years
6 - 16 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP.
Shift timings: 2:00 PM-11:00 PM (free cab facility for drop) + food.
Please let me know if any of your friends are looking for a job change, and kindly share references.
Please note: WFO - Work From Office (no hybrid or work from home).
Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, AWS S3, EC2.
Preferred skills: any ETL tools.
Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043.
Company URL: www.suntechnologies.com
Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 3 months ago
5 - 7 years
22 - 25 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Greetings from InfoVision! We at InfoVision are looking to fill the position of Data Engineer, with a skill-set focus on data pipelines, Azure Databricks, PySpark/Python, Azure DevOps, DWH, and Azure Data Lake Storage Gen2.
Company profile: InfoVision, founded in 1995, is a leading global IT services company offering enterprise digital transformation and modernization solutions across business verticals. We partner with our clients in driving innovation, rethinking workflows, and transforming experiences so businesses can stay ahead in a rapidly changing world. We help shape a bold new era of technology-led disruption, accelerating digital with quality, agility, and integrity. We have helped more than 75 global leaders across the Telecom, Retail, Banking, Healthcare, and Technology industries deliver excellence for their customers. InfoVision's global presence enables us to offer offshore, nearshore, and onshore solutions for our customers. With our world-class infrastructure for employees and people-centric policies, InfoVision is one of the highest-rated digital services companies in Glassdoor ratings. We encourage our employees to thrive, and we are committed to providing a work environment that fosters an entrepreneurial mindset, nurtures inclusivity, values integrity, and accelerates your career by creating opportunities for promising growth.
Designation: Data Engineer. Experience required: 5-7 years. Job location: Hyderabad, Chennai, Coimbatore, Pune, Bangalore. The opportunity is full time, with a hybrid work model.
As a Data Engineer on our team, you will be responsible for assessing complex new data sources and quickly turning them into business insights. You will also support the implementation and integration of these new data sources into our Azure data platform.
Responsibilities:
- Review and analyze structured, semi-structured, and unstructured data sources in detail for quality, completeness, and business value.
- Design, architect, implement, and test rapid prototypes that demonstrate the value of the data, and present them to diverse audiences.
- Participate in early-stage design and feature-definition activities.
- Implement robust data pipelines using the Microsoft/Databricks stack.
- Create reusable and scalable data pipelines.
- Collaborate as a team player with members across multiple engineering teams to support the integration of proven prototypes into core intelligence products.
- Communicate effectively, conveying complex data insights to non-technical stakeholders.
- Work collaboratively in cross-functional teams while managing multiple projects simultaneously.
Skills:
- Advanced working knowledge of, and experience with, relational and non-relational databases.
- Experience building and optimizing big data pipelines, architectures, and datasets.
- Strong analytic skills related to working with structured and unstructured datasets.
- Hands-on experience in Azure Databricks, utilizing Spark to develop ETL pipelines.
- Strong proficiency in data analysis, manipulation, and statistical modeling using tools like Spark, Python, Scala, SQL, or similar languages.
- Strong experience in Azure Data Lake Storage Gen2, Azure Data Factory, Databricks, Event Hub, and Azure Synapse.
- Familiarity with several of the following technologies: Event Hub, Docker, Azure Kubernetes Service, Azure DWH, Azure API, Azure Functions, Power BI, Azure Cognitive Services.
- Azure DevOps experience to deploy data pipelines through CI/CD.
Qualifications and Experience: Minimum 5-7 years of practical experience as a Data Engineer. Bachelor's degree in computer science, software engineering, information technology, or a related field. In-production experience with the Azure cloud stack.
You can share your updated resume to Bojja.Chandu@Infovision.com along with the details below.
Full name: / Current company: / Payroll company: / Experience: / Relevant exp: / Current location: / Preferred location: / CTC: / ECTC: / Notice period: / Holding offers?:
You can also connect with me on LinkedIn: https://www.linkedin.com/in/chandu-b-a48b2a142/
Regards, Chandu.B, InfoVision, Senior Executive - Talent Acquisition, Bojja.Chandu@Infovision.com
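As a short illustration of the Databricks/ADLS Gen2 work described above, a hedged delta-spark upsert sketch with hypothetical paths and keys; assumes a Databricks-style session with Delta Lake available:

```python
# Upsert a landed batch into a Delta table on ADLS Gen2 via delta-spark.
# Container paths and the join key are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("abfss://landing@lake.dfs.core.windows.net/customers/")
target = DeltaTable.forPath(spark, "abfss://curated@lake.dfs.core.windows.net/dim_customer")

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()     # refresh rows whose key already exists
       .whenNotMatchedInsertAll()  # insert brand-new keys
       .execute())
```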
Posted 3 months ago