7.0 - 12.0 years
25 - 40 Lacs
kochi, bengaluru
Work from Office
- Expertise in Tableau
- Experience integrating with Redshift or other DWH platforms and understanding of DWH concepts
- Python or similar scripting languages
- Experience with AWS Glue, Athena, or other AWS data services
- Strong SQL skills; AWS QuickSight
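For illustration, the AWS-side scripting such a role involves might look like this hedged boto3 Athena sketch; the database, table, region, and S3 output location are hypothetical:

```python
import time
import boto3

# Minimal Athena query runner; region, database, and bucket names are hypothetical.
athena = boto3.client("athena", region_name="ap-south-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> list[dict]:
    """Submit a query, poll until it finishes, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

rows = run_athena_query(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region",
    database="sales",
    output_s3="s3://my-athena-results/",
)
```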
Posted 1 day ago
3.0 - 5.0 years
3 - 6 Lacs
bengaluru
Hybrid
We are looking for a skilled SQL Developer with Power BI expertise to join our team. The ideal candidate will have a strong foundation in SQL and a proven ability to design and develop insightful dashboards and reports using Power BI.
Key Responsibilities:
Power BI Development:
- Design, develop, and maintain interactive dashboards and paginated reports
- Build robust data models, measures, and calculated columns using DAX
- Implement row-level security and performance-tune visuals and datasets
Database Development:
- Write and optimize complex queries, stored procedures, views, and functions (MS SQL Server & PostgreSQL)
- Collaborate on data warehouse architecture using star/snowflake schemas
- Load, transform, and validate data from multiple sources as needed
- Develop and manage ETL packages, ensuring data quality, consistency, and lineage
Required Skills:
- 2+ years of hands-on experience in Power BI development
- Strong command of Microsoft SQL Server and PostgreSQL
- Proficiency in DAX and Power BI data modeling
- Experience with ETL design, implementation, and data validation
- Understanding of data warehousing concepts and dimensional modeling
- Attention to detail and strong problem-solving skills
Nice to Have:
- Experience with large datasets and data performance tuning
- Exposure to financial or market data analytics
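As one concrete flavor of the stored-procedure work above, a hedged Python sketch invoking a hypothetical SQL Server ETL procedure via pyodbc; server, database, procedure, and table names are placeholders:

```python
import pyodbc

# Hypothetical example: invoke an ETL stored procedure on SQL Server and
# sanity-check the row count it loaded.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=SalesDW;"
    "Trusted_Connection=yes;"
)
cur = conn.cursor()

# ODBC call syntax for a stored procedure with one parameter.
cur.execute("{CALL dbo.usp_RefreshSalesMart (?)}", "2024-01-31")
conn.commit()

cur.execute("SELECT COUNT(*) FROM dbo.FactSales WHERE LoadDate = ?", "2024-01-31")
print("rows loaded:", cur.fetchone()[0])
```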
Posted 1 day ago
8.0 - 13.0 years
0 Lacs
karnataka
On-site
Role Overview:
You will be responsible for designing, building, and maintaining robust data models and pipelines using Azure Synapse and MS SQL to support analytics, reporting, and performance needs. Your role will involve implementing scalable ingestion frameworks for structured and unstructured data into the data warehouse (DWH), ensuring data integrity and consistency. Additionally, you will use Python for data processing, automation, and analytics, following best practices for maintainability and performance. Managing tasks and progress using Jira will be crucial to ensure timely delivery of high-quality, production-ready data solutions aligned with business goals. Collaboration with stakeholders to gather requirements, provide updates, and deliver solutions will also be a key aspect of your responsibilities. Supporting BI teams with data readiness for tools like Power BI and validating data accuracy and completeness will be part of your day-to-day tasks.
Key Responsibilities:
- Design, build, and maintain robust data models and pipelines using Azure Synapse and MS SQL.
- Implement scalable ingestion frameworks for structured and unstructured data into the data warehouse (DWH).
- Use Python for data processing, automation, and analytics.
- Manage tasks and progress using Jira for timely delivery of high-quality data solutions.
- Work closely with stakeholders to gather requirements, provide updates, and deliver solutions.
- Support BI teams with data readiness for tools like Power BI.
- Validate data accuracy and completeness, optimize workflows through automation, and ensure adherence to governance policies and regulatory standards.
Qualifications Required:
- Proven experience in developing and maintaining data models and pipelines in Azure environments, with strong proficiency in MS SQL and Python for data engineering and analytics tasks.
- Solid understanding of data warehousing concepts, data validation, cleansing, and quality assurance processes.
- Familiarity with Jira for agile project and task management, and the ability to work both independently and collaboratively in a team environment.
- Strong problem-solving skills and a delivery-oriented mindset, with a focus on producing high-quality, scalable solutions.
- Excellent communication and stakeholder engagement skills to effectively gather requirements and provide updates.
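As a flavor of the data validation responsibilities above, a minimal pandas sketch; file paths and column names are hypothetical:

```python
import pandas as pd

# Illustrative completeness check between a source extract and the warehouse
# load. Paths and columns are placeholders.
source = pd.read_csv("source_extract.csv")
loaded = pd.read_parquet("synapse_load.parquet")

checks = {
    "row_count_matches": len(source) == len(loaded),
    "no_null_keys": loaded["customer_id"].notna().all(),
    "no_duplicate_keys": not loaded["customer_id"].duplicated().any(),
    "amount_totals_match": abs(source["amount"].sum() - loaded["amount"].sum()) < 0.01,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data validation failed: {failed}")
print("All validation checks passed.")
```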
Posted 5 days ago
6.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
Job Description: Greetings of the day! We are looking for a skilled Azure Databricks professional to join our team at Hexaware Technologies. In this role, you will be responsible for working on development projects within Azure Databricks, with a focus on PySpark coding and SQL expertise.
**Key Responsibilities:**
- Utilize your 6-12 years of IT experience, preferably in cloud technology, with a minimum of 4 years in Azure Databricks on development projects.
- Be 100% hands-on in PySpark coding and demonstrate strong SQL expertise in writing advanced/complex SQL queries.
- Prior experience in Data Warehousing is a must for this role.
- Programming experience using Python is advantageous.
- Work on data ingestion, preparation, integration, and operationalization techniques to meet data requirements effectively.
- Understand system architecture involving Data Lakes, Data Warehouses, and Data Marts.
- Take ownership of end-to-end development processes, including coding, testing, debugging, and deployment.
- Excellent communication skills are essential for this role.
**Requirements (please share the following details):**
- Total Experience:
- Relevant Experience (Azure Databricks):
- PySpark:
- Python:
- SQL:
- DWH:
- Current CTC:
- Expected CTC:
- Notice Period (negotiable):
- Highest Qualification:
- Current Location:
- Preferred Location:
If you are interested in this opportunity and meet the qualifications mentioned above, please fill in the details requested to process your candidature. An immediate to 30 days notice period is preferred for this position.
Joining Locations: Bangalore/Chennai/Mumbai/Pune/Noida
We look forward to welcoming a professional like you to our team!
Regards,
Shweta Srivastava
Deputy Manager Recruitment
Email: ShwetaS3@hexaware.com
Website: www.hexaware.com
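For a flavor of the hands-on PySpark work this role centers on, a minimal sketch; paths and column names are hypothetical, and on Databricks the SparkSession is already provided as `spark`:

```python
from pyspark.sql import SparkSession, functions as F

# Read raw data, apply transformations, write curated output. Paths are
# placeholders; Delta is the usual storage format on Databricks.
spark = SparkSession.builder.appName("orders-curation").getOrCreate()

orders = spark.read.option("header", True).csv("/mnt/raw/orders/")

curated = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_status") == "COMPLETED")
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_total"))
)

curated.write.format("delta").mode("overwrite").save("/mnt/curated/daily_orders/")
```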
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining ITSource Technologies Limited, a client-oriented IT services, BPO, and IT staffing company known for its commitment to excellence. With a track record of over 22 years, we specialize in Enterprise Applications, Big Data, Staffing, BI, Cloud, and Web Solutions. Our industry reputation is built on a customer-centric approach and exceptional talent management capabilities.
As an AWS Data Architect based in Pune, this full-time on-site role will involve overseeing data governance, data architecture, data modeling, Extract Transform Load (ETL), and data warehousing processes. Your responsibilities will include applying data modeling techniques to address new business requirements, utilizing SQL, Advanced SQL, and NoSQL databases, managing Data Warehousing and ETL tasks, engaging with customers to grasp business needs, and working with Tableau and Power BI tools.
To excel in this role, you should possess skills in Data Governance, Data Architecture, Data Modeling, and ETL processes. Strong analytical and problem-solving abilities are essential, as well as familiarity with AWS services and infrastructure. Effective communication and collaboration skills are key, along with a Bachelor's degree in Computer Science or a related field. An AWS certification would be considered a plus.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
We are seeking an experienced Data Architect to become a valuable member of our diverse team. In this role, your primary focus will be on delivering top-notch, scalable solutions by utilizing your expertise in Gen AI solutions, cloud technologies, and system design to exceed customer expectations.
Your responsibilities will include designing and developing architectural concepts, creating technical roadmaps, and ensuring that software architecture aligns with business requirements. You will be responsible for setting up Continuous Integration, analyzing large databases on scalable cloud infrastructures, and developing prototypes using big-data technologies. As a Data Architect, you will also be expected to demonstrate cloud expertise by developing and deploying applications on leading cloud platforms, providing technical leadership, coaching team members on software design approaches, and leading efforts in application integration with PLM systems like SAP.
To qualify for this role, you should hold a BE / B.Tech / ME / M.Tech degree with a minimum of 8 years of experience in data projects, including at least 2 years as a Data Architect. Your desired knowledge and experience should include a strong understanding of Data Pipelines, DevOps, Data Analysis, Data Modeling, Data Warehouse Design, Data Integration patterns, and various data management technologies. Additionally, you should possess expertise in technologies such as Python/Java, NoSQL DBs, ETL and DWH concepts, SQL, Azure DevOps, Docker, Kubernetes, and REST APIs, and have hands-on experience with Azure Cloud services. Knowledge of Machine Learning & Deep Learning, fine-tuning pre-trained models, model deployment, API development & integration, and event-driven architecture will be advantageous.
Soft skills and other capabilities required for this role include excellent problem-solving and decision-making skills, effective communication abilities, the capacity to work independently, self-motivation, and strong teamwork skills to lead and motivate team members technically.
Posted 1 week ago
3.0 - 8.0 years
5 - 15 Lacs
hyderabad
Hybrid
Experience: 3 to 10 years
Location Preference: Hyderabad
- Hands-on experience with Snowflake data warehousing
- Strong proficiency in SQL for data manipulation and analysis
- Working knowledge of Snowpipe for real-time data ingestion
- Experience with data pipelines, ETL processes, and cloud platforms
If you're ready to take the next step in your data career and showcase your skills, please email us; our team is eager to meet passionate professionals like you! For any queries, feel free to reach out at priyanka.tiwari@brillio.com
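For illustration, a hedged sketch of the Snowpipe setup named above using the Snowflake Python connector; account, stage, and object names are placeholders:

```python
import snowflake.connector

# A pipe that auto-ingests files landing in an external stage. All names
# below are hypothetical.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

cur.execute("""
    CREATE PIPE IF NOT EXISTS raw.orders_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw.orders
      FROM @raw.orders_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Check recent load activity for the pipe.
cur.execute("SELECT SYSTEM$PIPE_STATUS('raw.orders_pipe')")
print(cur.fetchone()[0])
```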
Posted 1 week ago
4.0 - 8.0 years
20 - 27 Lacs
noida, bengaluru
Work from Office
Looking for relevant work experience in SSIS, SSAS, DW, Data Analysis, and Business Intelligence. Strong in Data Warehousing, Business Intelligence, and Dimensional Modelling concepts, with experience in designing, developing, and maintaining ETL.
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
mumbai, bengaluru, delhi / ncr
Hybrid
We are Hiring: Data Engineers – Enterprise Data Platforms & Analytics (50+ Openings | 3-20 Yrs Exp | Multiple Levels)
Location: Mumbai / Pune / Chandigarh / Bengaluru / Gurugram / Noida / New Delhi / Kolkata / Hyderabad / Chennai (remote and hybrid options are available for some roles)
Industry: Fortune 500 Client Projects (Staffing via Hatchtra Innotech Pvt. Ltd.)
Employment Type: Full-Time / Contract (C2H)
About us: Hatchtra is a leading staffing and workforce solutions company, trusted by Fortune 500 organizations and global enterprises to build their technology teams. When you join us, you'll work directly with our Fortune 500 client teams that power mission-critical systems worldwide.
About the Role: We are looking for Data Engineers (3-20 years) to design, build, and maintain scalable data pipelines and enterprise data platforms. This role is ideal for engineers passionate about big data, cloud data platforms, ETL/ELT processes, and analytics engineering to support global enterprises in their digital transformation journey. From data ingestion to advanced analytics enablement, you will play a critical role in turning raw data into actionable insights.
Open Positions & Designations:
- Associate Data Engineer (3-5 Yrs)
- Data Engineer / Senior Data Engineer (5-10 Yrs)
- Lead Data Engineer / Analytics Engineer (8-12 Yrs)
- Data Engineering Manager / Solutions Architect (10-15 Yrs)
- Director – Data Platforms & Engineering (15-20 Yrs)
Key Responsibilities (scale with level):
Professional (3-5 Yrs):
- Design and implement ETL/ELT pipelines to integrate data from various sources.
- Build and optimize data warehouses and data lakes on cloud or on-prem platforms.
- Work with SQL, Python, or Scala for data processing and transformation.
- Implement data quality checks and validation frameworks.
- Support analytics teams by making clean, reliable data available.
- Develop basic automation scripts for data ingestion and reporting workflows.
Mid-Level (5-10 Yrs):
- Lead data integration projects across structured, semi-structured, and unstructured data.
- Optimize pipelines for performance, scalability, and cost-effectiveness.
- Implement data modeling best practices for analytics and BI.
- Collaborate with data scientists, analysts, and product teams to enable ML/AI use cases.
- Deploy streaming data pipelines using Kafka, Kinesis, or Azure Event Hub.
- Manage and improve data governance, lineage, and metadata management.
Senior/Leadership (10-20 Yrs):
- Define enterprise data engineering strategy and architecture standards.
- Lead multi-cloud data platform modernization initiatives (AWS, Azure, GCP).
- Build and manage global data engineering teams and delivery models.
- Partner with executives to deliver data-driven transformation programs.
- Oversee compliance, data security, and privacy frameworks (GDPR, HIPAA, SOC2).
- Drive innovation in real-time analytics, serverless data processing, and AI-driven data engineering.
Skills & Tools:
- Core Expertise: ETL/ELT Development, Data Modeling, Pipeline Automation, Data Warehousing
- Programming: SQL, Python, Scala, Java
- Data Platforms: Snowflake, BigQuery, Azure Synapse, Amazon Redshift, Databricks
- Orchestration: Apache Airflow, AWS Step Functions, Azure Data Factory, dbt
- Streaming: Kafka, Kinesis, Azure Event Hubs, Spark Streaming
- Big Data Tools: Apache Spark, Hadoop, Hive
- Cloud Expertise: AWS, Azure, GCP (multi-cloud deployments)
- DevOps for Data: Docker, Kubernetes, Terraform, CI/CD for DataOps
Preferred Certifications:
- Google Professional Data Engineer
- AWS Certified Data Analytics – Specialty
- Microsoft Certified: Azure Data Engineer Associate
Qualifications:
- Bachelor's/Master's in Computer Science, Data Engineering, or a related field.
- 3+ years of experience in data engineering, ETL/ELT, or data platform development.
- Strong proficiency in SQL, Python/Scala, and cloud-native data services.
- Experience in building scalable, secure, and automated pipelines.
- Proven leadership and strategic architecture experience for senior roles.
Why Join Us?
- Contribute to enterprise-scale data transformation projects.
- Collaborate with Fortune 500 companies and global engineering teams.
- Exposure to cutting-edge cloud data platforms and big data technologies.
- Career growth, certifications, and flexible work options.
How to Apply: For quick consideration, please email your resume and include the desired position and experience level (e.g., "Data Engineer – Mid-Level") in the subject line.
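As a concrete taste of the orchestration stack listed above, a minimal Airflow DAG sketch; task bodies are stubs, and the DAG id and schedule are assumptions:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Stub task callables; a real pipeline would extract from a source system,
# transform, and load into the warehouse.
def extract():
    print("extracting...")

def transform():
    print("transforming...")

def load():
    print("loading...")

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```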
Posted 1 week ago
8.0 - 12.0 years
20 - 35 Lacs
hyderabad
Work from Office
Data Engineering Lead
About the Role: We are searching for a strategic, results-driven Data Engineering Lead to join our Data Science software engineering team. The ideal candidate will possess a robust technical background in data engineering, extensive hands-on experience with modern data architectures, and a proven ability to mentor and lead high-performing engineering teams. As a thought leader, you will design and implement scalable data solutions that empower our data scientists and drive impactful insights across the organization.
Key Responsibilities:
- Lead, mentor, and grow a team of data engineers, fostering technical excellence and professional development.
- Architect, develop, and maintain advanced ELT data pipelines using state-of-the-art tools and best practices.
- Collaborate closely with data scientists to understand analytical requirements and translate them into scalable, reliable data solutions.
- Oversee the management and optimization of distributed, column-oriented databases such as Vertica and Snowflake to ensure high performance and data integrity.
- Drive the implementation and continuous improvement of data workflow orchestration with tools like Apache Airflow.
- Conduct root cause analysis and lead troubleshooting efforts to resolve complex data issues, ensuring availability and quality of critical datasets.
- Champion best practices for data governance, security, and compliance throughout all engineering processes.
- Document data processes, workflows, and architectural decisions for transparency and knowledge sharing.
- Promote innovation by evaluating emerging technologies and integrating them into existing data platforms as appropriate.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of progressive experience in data engineering, with at least 2 years in a technical leadership role.
- Expertise in Python and SQL for large-scale data manipulation and transformation.
- Extensive experience with cloud-based data platforms and services, particularly AWS.
- Deep understanding of data warehousing, dimensional modelling, and analytics frameworks.
- Hands-on experience with orchestration and workflow automation tools such as Apache Airflow and Jenkins.
- Proficiency in distributed, column-oriented databases (e.g., Vertica, Snowflake) and familiarity with CI/CD practices using Git or similar tools.
- Demonstrated ability to solve complex analytical problems and deliver high-quality, reliable data solutions at scale.
- Outstanding communication, collaboration, and organizational skills, with a track record of cross-functional partnership.
Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related discipline.
- Experience implementing data governance, security, and compliance at scale.
- Knowledge of Apache Spark for distributed data processing and Apache Kafka for real-time data streaming.
- Background in supporting data science workflows and familiarity with machine learning data needs.
Join us to lead our data engineering efforts and enable the next generation of data science innovation.
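For the real-time streaming work named under the preferred qualifications, a small hedged sketch using kafka-python; the broker, topic, consumer group, and field names are hypothetical:

```python
import json

from kafka import KafkaConsumer

# Illustrative consumer; all names are placeholders.
consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers=["broker1:9092"],
    group_id="analytics-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline these events would be batched into the warehouse
    # (e.g., Vertica or Snowflake) rather than printed.
    print(event.get("user_id"), event.get("event_type"))
```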
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As a Technical Consultant specializing in Informatica and Oracle, you will be responsible for understanding complex technical and architectural issues, as well as the implications associated with the chosen technical strategy. Your role will involve interacting with various levels of the technical and business community to seek approval from all stakeholders involved in the projects.
To qualify for this position, you should hold a B.E./B.Tech. or M.C.A. in Computer Science from a reputed university with a minimum of 10 years of relevant industry experience. Your expertise should include demonstrating technical leadership in Informatica, Oracle, Unix Scripting, Perl, and scheduling tools such as Autosys/Control-M. Additionally, you should possess a sound knowledge of Database Design, Data Warehouse, Data Mart, Enterprise Reporting, and ODS concepts.
Your responsibilities will include but are not limited to:
- Demonstrating strong Oracle PL/SQL and T-SQL experience
- Managing complete project lifecycle execution from requirements analysis to Go Live, with exposure to Agile methodology being a plus
- Producing design/technical specifications and proposing solutions for new projects
- Collaborating with delivery managers, System/Business Analysts, and other subject matter experts to design solutions
- Working closely with business and technology teams to ensure proposed solutions meet requirements
- Developing and implementing standards, procedures, and best practices for data management and optimization
- Guiding and mentoring junior team members in solution building and troubleshooting
- Utilizing your strong communication skills to effectively liaise with stakeholders
- Having knowledge of Fund accounting, Fund reporting, and derivative concepts
In addition to the above, you should have exposure to building reporting solutions with WebFOCUS/OBIEE. Overall, your role as a Technical Consultant will require a deep understanding of data modeling, data normalization, and performance optimization techniques. If you have a strong background in Solution Architecture, Informatica, Oracle, DWH, PL/SQL, Unix Scripting, Perl, Autosys, Control-M, Data Modeling, Data Normalization, Performance Optimization, OBIEE, WebFOCUS, or Fund Accounting, we encourage you to apply for this exciting opportunity in the IT/Computers-Software industry.
Posted 2 weeks ago
8.0 - 14.0 years
0 Lacs
karnataka
On-site
You should have a minimum of 8 years of experience in Data Modelling & Data Analysis. Your primary responsibilities will include metadata management, data modelling, and working with related tools such as Erwin or ER Studio. With an overall experience of 10+ years in IT, you must possess hands-on experience in relational, dimensional, and/or analytic practices using RDBMS, dimensional data platform technologies, ETL, and data ingestion. Your role will involve working with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts.
Strong communication and presentation skills are essential, as you will help the team implement business and IT data requirements through new data strategies and designs across all data platforms and tools. You will collaborate with business and application/solution teams to implement data strategies and develop conceptual/logical/physical data models. Defining and governing data modelling and design standards, tools, best practices, and related development for enterprise data models will be part of your responsibilities.
You should have hands-on experience in modelling and mapping between source system data models and data warehouse data models. Being proactive and independent in addressing project requirements, and articulating issues and challenges to reduce project delivery risks related to modelling and mappings, is crucial. Proficiency in writing complex SQL queries is a must, and experience in data modelling for NoSQL objects would be beneficial. Your expertise in Data Analysis & Structuring, Transformation Logic, SQL (Complex Queries), Stored Procedures & Functions, ETL Development, Snowflake (basic understanding), and Reporting Integration will be valuable. Ideally, you should have experience in the Banking/Finance domain.
iOSYS Software India Pvt. Ltd. is a leading provider of IT solutions, specializing in Digital Marketing, Mobile Applications, and IT Development Solutions. As an enabler of innovation in the mobility space, we are committed to building industry-leading mobile applications and responsive web design solutions. If you have a passion for data modelling, analytical skills, and a knack for problem-solving, we welcome you to join our dynamic team and contribute to our mission of delivering innovative IT solutions to our clients.
Posted 2 weeks ago
3.0 - 5.0 years
40 - 45 Lacs
kochi, kolkata, bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.
Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.
Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience in building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, and Redshift, and other cloud computing platforms.
Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
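A hedged sketch of the PySpark-on-S3 pipeline work described above; bucket paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# Ingest raw events from S3, derive a date partition, and write Parquet that
# Athena or Redshift Spectrum can query. All paths are placeholders.
spark = SparkSession.builder.appName("events-ingest").getOrCreate()

events = spark.read.json("s3://raw-bucket/events/2024/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)

(daily.write
    .mode("append")
    .partitionBy("event_date")
    .parquet("s3://curated-bucket/events/"))
```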
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate should have at least 5 years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is a great opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.
Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, focusing on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.
Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages like Java or Scala is a plus).
Education:
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
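For context on the SCD Type-2 responsibility above: dbt snapshots generate SQL similar to the following hedged sketch, shown here via the Snowflake Python connector with hypothetical table and column names:

```python
import snowflake.connector

# The two-step pattern beneath an SCD Type-2 load: expire changed rows,
# then insert new current versions. All object names are placeholders.
conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

# Step 1: close out current rows whose attributes changed in staging.
cur.execute("""
    UPDATE dim_customer
    SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
    FROM stg_customer s
    WHERE dim_customer.customer_id = s.customer_id
      AND dim_customer.is_current
      AND (dim_customer.email <> s.email OR dim_customer.segment <> s.segment)
""")

# Step 2: insert a new current version for changed or brand-new customers
# (changed customers no longer have a current row after step 1).
cur.execute("""
    INSERT INTO dim_customer (customer_id, email, segment,
                              valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.segment,
           CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL
""")
conn.commit()
```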
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
You will lead a team and build systems that solve complex business problems at scale. You will work closely with product managers to understand customer needs and translate those into technical requirements. Furthermore, you will build and maintain scalable data management and serving infrastructure on the cloud, managing terabytes of data and integrating cost-effectively. Your role will involve building, leading, and managing a rockstar team, as well as taking end-to-end ownership of the engineering process.
You will be responsible for designing a comprehensive target data architecture vision in collaboration with product, design, development, and NatCo teams to plan the transition towards the target vision. Your responsibilities also include implementing architecture concepts in product development and services, consulting and engaging with engineering teams to solve problems at their root cause, and creating deliverables that provide exceptional clarity and connections to business objectives.
In the realm of data specialization, you are expected to possess comprehensive and expert knowledge, with more than 5 years of experience in Data Management and Data Architectures (DWH, Spark, Lakehouse) and an overall experience of 7-10 years. You should have a deep understanding of concepts and principles related to both batch and streaming data, API/microservices architectures, data warehouses, and mesh architectures. Strong expertise in Big Data applications and architecture best practices, and experience with legacy Data Warehouses and Big Data platforms, are essential. Additionally, you should have exposure to and an understanding of decision engine concepts and the implementation of MLOps within an enterprise. Your experience in software engineering and programming languages like Java and Scala will be valuable. A proven track record of working with globally federated IT teams is expected, along with a sound understanding of the challenges involved and the ability to bring teams and people together to deliver a shared outcome. Data Modelling experience, including Information Framework (SID) and Data Vault 2.0, is a plus.
In terms of security and regulatory rigor, you should be experienced in implementing solutions that protect sensitive and regulated customer data. Your expertise should cover security architecture concepts such as identity and access management, network isolation, transport protection, cloud security specifics, encryption & key management, and data privacy treatment techniques.
Strong interpersonal skills are crucial for facilitating difficult discussions within the team or with diverse senior stakeholders. You should be resilient and able to adapt yourself and your teams to changing environments and business situations. Extensive experience in multicultural environments and fast-moving, highly competitive markets is desired.
Posted 2 weeks ago
7.0 - 12.0 years
10 - 20 Lacs
pune, chennai, mumbai (all areas)
Hybrid
Hexaware Technologies is hiring a Senior Data Engineer.
Primary Skills: ETL, SSIS, SQL, and strong experience in DWH
Notice Period: Immediate/early joiners preferred
Location: Chennai, Mumbai, Pune, Bangalore
Total experience required: 6 to 12 yrs
If you are interested in attending a face-to-face interview on 23rd Aug at the Chennai location, kindly share your updated resume with the details below:
- Full name:
- Contact No:
- Total IT exp:
- Rel Exp in (ETL, SSIS, DWH):
- Current Org:
- Current location:
- Current CTC:
- Exp CTC:
- Notice Period (mention LWD, if serving):
Job Description:
- 5+ years of experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance.
- 5+ years of experience writing and optimizing SQL queries and stored procedures for data processing and data analysis.
- 5+ years of experience designing and building complete data pipelines, moving and transforming data for ODS, Staging, Data Warehousing, and Data Marts using SQL Server Integration Services (ETL) or other related technologies.
- 5+ years of experience implementing Data Warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies.
- 5+ years of experience with large-scale data processing and query optimization techniques using T-SQL.
- 5+ years of experience implementing audit, balance, and control mechanisms in data solutions.
- 3+ years of experience with source control repos like Git, TFVC, or Azure DevOps, including branching and merging, and implementing CI/CD pipelines for database and ETL workloads.
- 2+ years of experience working with Python pandas libraries to process semi-structured data sets and load them into a SQL Server DB.
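For the pandas requirement in the last bullet, a minimal sketch; the connection string, file, and table names are hypothetical:

```python
import pandas as pd
from sqlalchemy import create_engine

# Flatten a semi-structured JSON extract and load it into SQL Server.
# All names below are placeholders.
engine = create_engine(
    "mssql+pyodbc://etl_user:secret@myserver/StagingDB"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

raw = pd.read_json("orders.json", lines=True)        # one JSON object per line
flat = pd.json_normalize(raw.to_dict("records"))     # flatten nested fields

flat.to_sql("stg_orders", engine, if_exists="append", index=False)
print(f"Loaded {len(flat)} rows into stg_orders")
```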
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
You will need 2-5 years of experience in testing within the capital market domain. It is essential to possess strong SQL knowledge and a good understanding of Data Warehousing concepts. Being an ISTQB Certified Tester would be an advantage for this role. You should be experienced in developing test plans and test strategies. Excellent communication skills are a must for effective coordination in this position. Prior experience with Quality Centre is a mandatory requirement.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
As a member of the infrastructure team at FIS, you will play a crucial role in troubleshooting and resolving technical issues related to Azure and SQL Server. Your responsibilities will include developing data solutions, understanding business requirements, and transforming data from different sources. You will design and implement ETL processes, collaborate with cross-functional teams to develop scalable solutions, and maintain technical documentation.
To excel in this role, a degree in Computer Science is preferred, along with a minimum of 4 years of experience. You should have a proficient working knowledge of Azure, SQL, and ETL, as well as experience with programming languages. Familiarity with data warehousing, JSON and XML data structures, and working with APIs will be beneficial.
At FIS, you will have the opportunity to learn, grow, and make an impact in your career. Our benefits include a flexible and creative work environment, a diverse and collaborative atmosphere, professional and personal development resources, opportunities for volunteering and supporting charities, and a competitive salary and benefits package. Please note that current and future sponsorship are not available for this position.
FIS is committed to protecting the privacy and security of personal information processed to provide services to clients. Recruitment at FIS primarily follows a direct sourcing model, and resumes from recruitment agencies not on the preferred supplier list will not be accepted. FIS is not responsible for any fees related to such submissions.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The QA Analyst role at Merkle requires 3 to 6 years of experience and involves acting as the intermediary between the development team and clients to ensure that deliverables align with client expectations and requirements throughout the project life cycle. The analyst becomes a subject matter expert for the project.
Key responsibilities include writing test plans, designing test cases, developing test SQL scripts, and testing procedures, with a focus on DWH and database testing. The analyst must be proficient in SQL and ETL testing, debugging differences, translating requirements into test cases, and applying Data Warehousing principles. Good communication skills are essential for effective stakeholder communication.
Desired skills include familiarity with cloud technologies, experience with JIRA and Agile, testing SOAP/API projects, stakeholder communication, and proficiency in Microsoft Office. Certification in AWS/Azure/GCP or Snowflake (Associate/Core) is preferred.
The QA Analyst is responsible for performing functional, regression, load, system, and user acceptance testing; creating test data; tracking and reporting defects using tools like JIRA; validating fixes; investigating data quality conditions; and communicating test status to the internal project team and clients.
Candidates should hold a Bachelor's or Master's degree or an equivalent degree. The shift timings are 12 PM to 9 PM and/or 2 PM to 11 PM in the IST time zone. The position is based at DGS India - Mumbai - Thane Ashar IT Park, operating under the Merkle brand on a full-time permanent contract basis.
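A small sketch of the source-to-warehouse reconciliation that DWH test SQL scripts typically perform; the DSNs, table names, and date filter are hypothetical:

```python
import pyodbc

# Compare row counts for one load date between source and target.
src = pyodbc.connect("DSN=source_db").cursor()
tgt = pyodbc.connect("DSN=warehouse").cursor()

src_count = src.execute(
    "SELECT COUNT(*) FROM orders WHERE order_date = ?", "2024-01-31"
).fetchone()[0]
tgt_count = tgt.execute(
    "SELECT COUNT(*) FROM fact_orders WHERE order_date = ?", "2024-01-31"
).fetchone()[0]

assert src_count == tgt_count, (
    f"Row count mismatch: source={src_count}, target={tgt_count}"
)
print("Reconciliation passed:", src_count, "rows")
```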
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The Senior Analyst-ETL Testing role requires a minimum of 5 years of experience in testing, with proficiency in Snowflake, Azure Cloud, and DWH. Strong expertise in SQL queries is essential for this position. The responsibilities include analyzing and comprehending business requirements to develop test cases for functional and non-functional components. The role also involves identifying, preparing, documenting, and executing various types of test cases, such as smoke, functional, non-functional, automation regression, and E2E.
Experience working within agile teams and collaborating effectively with Product Owners, developers, and testers is crucial. The ideal candidate should be an adaptive team player capable of working both independently and collaboratively to meet aggressive timelines. Clear communication skills are required for articulating defects, documenting evidence in JIRA, and ensuring thorough re-testing until closure. Familiarity with JIRA/XRAY and experience in the Securities & Capital Market domain are preferred.
The successful candidate should possess excellent communication skills and demonstrate proficiency in ETL testing. The role requires 3-5 years of relevant experience. Join a dynamic and innovative team at Wipro, where reinvention is encouraged and individuals are empowered to evolve their careers and skills. Embrace the opportunity to contribute to the digital transformation journey and realize your professional ambitions in a purpose-driven environment. Wipro welcomes applications from individuals with disabilities.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
Wipro Limited is a leading technology services and consulting company dedicated to creating innovative solutions that cater to the complex digital transformation requirements of clients. With a comprehensive range of capabilities in consulting, design, engineering, and operations, we assist clients in achieving their most ambitious goals and establishing sustainable businesses for the future. Our global presence spans over 65 countries with a workforce of more than 230,000 employees and business partners, enabling us to support our customers, colleagues, and communities in adapting to an ever-evolving world.
We are currently seeking a Test Lead specializing in ETL Testing with a focus on data-centric testing. The ideal candidate should possess a minimum of 7 years of experience in ETL Testing and must have proficiency in Snowflake, Azure Cloud, and Data Warehouse technologies. Additionally, expertise in SQL queries is essential for this role.
Key Responsibilities:
- Interpret and comprehend business requirements to develop test cases for both functional and non-functional components
- Create, document, and execute various types of tests, including smoke, functional, non-functional, automation regression, and end-to-end tests
- Collaborate effectively with Product Owners, developers, and testers in agile teams
- Demonstrate adaptability and teamwork to meet project timelines, both as an individual contributor and within a team setting
- Clearly articulate defects, document evidence in JIRA, and ensure thorough re-testing until resolution
- Familiarity with JIRA/XRAY for issue tracking and management
- Preferred experience in the Securities & Capital Market domain
- Excellent communication skills to effectively convey information and ideas
Join us at Wipro and be a part of our journey to reinvent the future. We are committed to evolving our business and industry, and we are looking for individuals who are inspired by the prospect of reinventing themselves, their careers, and their skills. At Wipro, you will have the opportunity to contribute to a purpose-driven organization that encourages you to shape your own reinvention. Realize your ambitions with us and become part of a diverse and inclusive workforce. Applications from individuals with disabilities are especially encouraged. Visit www.wipro.com for more information on how you can be a part of our modern Wipro and contribute to the digital transformation landscape.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You should have strong expertise in Looker and LookML, advanced SQL skills including experience in query optimization, and proficiency in Data Warehouse (DWH) concepts and BigQuery. Excellent communication and team leadership abilities are essential, as effective collaboration within the team is central to this role. A Bachelor's degree in Engineering is required.
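For illustration, the BigQuery side of such a role might look like the following hedged sketch, the kind of optimized SQL a Looker explore would generate; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

# Run an aggregate query against BigQuery and print the results.
client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT region, COUNT(DISTINCT customer_id) AS customers
    FROM `my-analytics-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY customers DESC
"""

for row in client.query(sql).result():
    print(row.region, row.customers)
```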
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a member of the infrastructure team at FIS, you will play a crucial role in troubleshooting and resolving technical issues related to Azure and SQL Server. Your responsibilities will include developing data solutions, understanding business requirements, and transforming data from different sources. You will design and implement ETL processes and collaborate with cross-functional teams to ensure that solutions meet business needs.
To excel in this role, you should have a degree in Computer Science, a minimum of 4 years of experience, and proficient working knowledge of Azure, SQL, and ETL. Any programming language skills will be an asset, along with working knowledge of Data Warehousing, experience with JSON and XML data structures, and familiarity with working with APIs.
At FIS, we offer a flexible and creative work environment where you can learn, grow, and make a real impact on your career. You will be part of a diverse and collaborative atmosphere, with access to professional and personal development resources. Additionally, you will have opportunities to volunteer and support charities, along with a competitive salary and benefits. Please note that current and future sponsorship are not available for this position.
FIS is committed to protecting the privacy and security of all personal information processed to provide services to our clients. For specific information on how FIS safeguards personal information online, please refer to the Online Privacy Notice. Recruitment at FIS primarily operates on a direct sourcing model, with a small portion of hires made through recruitment agencies. FIS does not accept resumes from agencies not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
The ideal candidate for this position should have a minimum of 4 years of experience in DWH & DB concepts, with strong proficiency in SQL and stored procedures. Your responsibilities will include creating a TDM strategy, utilizing TDM tools such as GenRocket, Optim, Informatica, CA TDM, and Solix for test data creation, and developing test plans specific to DWH and business requirements.
You should have a solid understanding of data models, data mapping documents, ETL design, and ETL coding. Experience with Oracle, SQL Server, Sybase, or DB2 technology is required. A bachelor's or master's degree in Computer Science, Engineering, or a related field is preferred.
The successful candidate will possess excellent problem-solving skills, attention to detail, and the ability to work independently and collaboratively as part of a team. Strong communication skills are essential for this role.
If you are currently located in Bangalore, Karnataka, or willing to relocate before starting work, you are encouraged to apply. Additionally, please be prepared to answer application questions related to your current and expected CTC, availability to join, current location, and experience in Data Warehousing, data models, ETL design, and TDM tools. A minimum of 4 years of experience in SQL is also required for this position. This is a full-time, permanent position that requires in-person work at the designated location.
Posted 1 month ago
4.0 - 8.0 years
0 - 0 Lacs
guwahati, assam
On-site
As a Senior Data Engineer specialized in Scala, you will be responsible for leading Spark 3.x, Scala, and Delta Lake implementation, as well as streaming solution implementation for IoT in Spark Streaming. Your expertise in Kafka is essential for this role. Any prior experience with MFG BI, DWH, and Datalake implementation will be considered a bonus. This position offers the flexibility of working from home in India and requires a total experience of 10+ years, with at least 4-5 years in Scala. The role is permanent under us with an annual salary range of INR 20-25 LPA. The notice period for this position is immediate to 30 days, and the interview process consists of 2 or 3 rounds.
Key Responsibilities:
- Understand the factories, manufacturing process, data availability, and avenues for improvement.
- Collaborate with engineering, manufacturing, and quality teams to identify problems solvable using the acquired data in the data lake platform.
- Define necessary data and collaborate with connectivity engineers and users to collect the data.
- Develop and maintain optimal data pipeline architecture.
- Assemble large, complex datasets meeting functional and non-functional business requirements.
- Identify, design, and implement process improvements, automate manual processes, and optimize data delivery for scalability.
- Work on data preparation and data deep dives, and help engineering, process, and quality teams understand process/machine behavior closely using available data.
- Deploy and monitor solutions.
- Collaborate with data and analytics experts to enhance functionality in data systems.
- Work alongside Data Architects and data modeling teams.
Skills / Competencies:
- Solid knowledge of the business vertical, with experience in solving use cases in manufacturing or similar industries.
- Ability to apply cross-industry learning to enhance manufacturing processes.
- Strong problem scoping, solving, and quantification skills.
- Proficient in working with unstructured datasets and building data transformation processes.
- Experience with message queuing, stream processing, and scalable big data stores.
- Skilled in data mining and data wrangling techniques for analytical dataset creation.
- Proficient in building and optimizing big data pipelines, architectures, and datasets.
- Adaptive mindset to address data challenges and drive desired outcomes.
- Experience with Spark, Delta, CDC, NiFi, Kafka, relational SQL, NoSQL databases, and query languages.
- Proficiency in object-oriented languages such as Scala, Java, and C++.
- Knowledge of visualization tools like Power BI and Tableau for data presentation.
- Ability to analyze data and generate findings and insights through exploratory data analysis.
- Strong understanding of data transformation and connection across various data types.
- Proficient in numerical and analytical skills, and in identifying data acquisition opportunities.
- Experience in enhancing data quality and reliability, and building algorithms and prototypes.
- Ability to optimize existing frameworks for better performance.
If you have the requisite expertise in Scala, Spark, and data engineering, and are keen to work on cutting-edge solutions for manufacturing processes, this role offers an exciting opportunity to make a significant impact in the domain.
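The posting calls for Scala; purely as an illustration of the Kafka-to-Delta streaming shape involved, here is a hedged PySpark sketch with hypothetical broker, topic, schema, and paths:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Stream IoT readings from Kafka into a Delta table. All names and paths
# are placeholders.
spark = SparkSession.builder.appName("iot-stream").getOrCreate()

schema = StructType([
    StructField("machine_id", StringType()),
    StructField("temperature", DoubleType()),
    StructField("ts", StringType()),
])

readings = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "iot-readings")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
    .select("r.*")
)

query = (
    readings.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/iot")
    .outputMode("append")
    .start("/mnt/delta/iot_readings")
)
```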
Posted 1 month ago