
126 Data Warehousing Jobs in Kolkata - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

9 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Data BA

About Us: Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

About the Role: Data BA (Bangalore, Chennai, Pune & Hyderabad) for client communication.
- Communication: excellent verbal and written communication for interacting with stakeholders and developers.
- Analytical and problem solving: analyse data, identify problems, and develop solutions.
- Critical thinking: ability to evaluate information, identify patterns, and make decisions.
- Stakeholder management: build rapport, manage stakeholder expectations, and collaborate with cross-functional teams.
- Requirement elicitation: hold workshops to elicit requirements, document business requirements effectively, and obtain sign-off, including mapping capabilities to epics and user stories in Jira.
- Process documentation: ability to understand as-is and to-be business processes and document them in Visio.
- Data cleaning/preparation: proficient in handling incomplete data sets, ensuring quality and accuracy.
- Databases: familiar with SQL for accessing and manipulating data.
- Data visualisation: ability to create informative charts, graphs, and dashboards to communicate with stakeholders.
- Programming languages: Python and SQL.
- Data warehousing: understanding of how data is stored and organised.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring, and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise - design, industrialization, manufacturing, and supply chain - and for managing manufacturing data.

Grade-specific focus on Digital Continuity and Manufacturing:
- Develops competency in own area of expertise.
- Shares expertise and provides guidance and support to others.
- Interprets clients' needs.
- Completes own role independently or with minimum supervision.
- Identifies problems and relevant issues in straightforward situations and generates solutions.
- Contributes to teamwork and interacts with customers.

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: Full-time 15 years of education

Roles and Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services such as BigQuery, Composer, etc.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (Extract, Transform, Load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills/tools used in the data space (e.g., dbt, Monte Carlo).
9. Excellent verbal and written communication skills; excellent analytical skills with an Agile mindset.
10. Strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Work effectively in a globally distributed environment.
13. Should be ready to work in shift B, i.e. 12:30 pm to 10:30 pm.
14. Should be ready to work as an individual contributor.

Skill Proficiency Expectation:
- Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing Concepts
- Intermediate: Python
- Basic/Preferred: DB, Kafka, Pub/Sub

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Mumbai office.

Qualification: Full-time 15 years of education
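For a sense of the day-to-day BigQuery work this role describes, here is a minimal sketch of running a query with the official google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, not anything from the listing.

```python
# Minimal sketch: run a BigQuery query and iterate the results.
# Assumes Google Cloud credentials are configured (e.g. via
# GOOGLE_APPLICATION_CREDENTIALS); project/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

sql = """
    SELECT order_date, SUM(amount) AS daily_total
    FROM `my-project.sales.orders`   -- hypothetical table
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(sql).result():  # submits the job and waits for it
    print(row["order_date"], row["daily_total"])
```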

Posted 1 week ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

Kolkata

Work from Office

We are looking for a skilled Data Engineer with strong hands-on experience in Clickhouse, Kubernetes, SQL, Python, and FastAPI, along with a good understanding of PostgreSQL. The ideal candidate will be responsible for building and maintaining efficient data pipelines, optimizing query performance, and developing APIs to support scalable data services.

Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines and ETL processes.
- Develop and optimize Clickhouse databases for high-performance analytics.
- Create RESTful APIs using FastAPI to expose data services.
- Work with Kubernetes for container orchestration and deployment of data services.
- Write complex SQL queries to extract, transform, and analyze data from PostgreSQL and Clickhouse.
- Collaborate with data scientists, analysts, and backend teams to support data needs and ensure data quality.
- Monitor, troubleshoot, and improve performance of data infrastructure.

Requirements:
- Strong experience in Clickhouse: data modeling, query optimization, performance tuning.
- Expertise in SQL, including complex joins, window functions, and optimization.
- Proficient in Python, especially for data processing (Pandas, NumPy) and scripting.
- Experience with FastAPI for creating lightweight APIs and microservices.
- Hands-on experience with PostgreSQL: schema design, indexing, and performance.
- Solid knowledge of Kubernetes: managing containers, deployments, and scaling.
- Understanding of software engineering best practices (CI/CD, version control, testing).
- Experience with cloud platforms like AWS, GCP, or Azure.
- Knowledge of data warehousing and distributed data systems.
- Familiarity with Docker, Helm, and monitoring tools like Prometheus/Grafana.
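As a rough illustration of the FastAPI-over-ClickHouse pattern this role centres on, a minimal sketch using the clickhouse-connect driver; the host, the `events` table, and its columns are assumptions made for the example.

```python
# Minimal sketch: expose a ClickHouse aggregation through a FastAPI endpoint.
# Assumes a reachable ClickHouse server and a hypothetical `events` table
# with a `ts` DateTime column (pip install fastapi clickhouse-connect uvicorn).
import clickhouse_connect
from fastapi import FastAPI

app = FastAPI()
client = clickhouse_connect.get_client(host="localhost")  # assumed instance

@app.get("/events/daily-counts")
def daily_counts(days: int = 7):
    result = client.query(
        "SELECT toDate(ts) AS day, count() AS n "
        "FROM events WHERE ts >= now() - INTERVAL %(days)s DAY "
        "GROUP BY day ORDER BY day",
        parameters={"days": days},
    )
    return [{"day": str(day), "count": n} for day, n in result.result_rows]
```

Run with `uvicorn main:app`; in production the connection would come from configuration and a pooled client rather than a module-level global.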

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Kolkata

Work from Office

We are looking for a Senior Python Developer with a passion for AI research and API development to join our growing team. In this role, you will be responsible for building scalable, high-performance APIs and contributing to AI/ML research and implementation. You will work closely with data scientists, researchers, and product teams to design and deploy intelligent systems that power our next-generation applications.

Key Responsibilities:
- Design, develop, and maintain Python-based APIs for AI/ML models and services.
- Collaborate with AI researchers to implement and optimize machine learning models.
- Conduct research into new AI/ML techniques and evaluate their applicability to business problems.
- Build RESTful and GraphQL APIs using frameworks like FastAPI, Flask, or Django REST Framework.
- Write clean, testable, and maintainable Python code with a focus on performance and scalability.
- Participate in code reviews, mentor junior developers, and contribute to best practices.
- Integrate AI models with backend systems and frontend applications.
- Stay up to date with AI/ML trends, Python libraries (e.g., PyTorch, TensorFlow, Scikit-learn), and API design patterns.
- Work in an agile environment, delivering high-quality software in iterative sprints.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 4+ years of professional experience in software development, with 3+ years in Python.
- Strong experience with Python web frameworks (e.g., FastAPI, Flask, Django).

What We're Looking For in a Candidate:
- A curious mind with a passion for AI and software development.
- A team player who can mentor and guide others.
- A self-starter who can take initiative and deliver results.
- A lifelong learner who stays current with emerging technologies and trends.

Why Join Us?
- Work on cutting-edge AI projects with real-world impact.
- Collaborate with top-tier researchers and engineers.
- Flexible work environment and remote-friendly options.
- Competitive salary and performance-based incentives.
- Opportunities for professional growth and leadership.
- A culture that values innovation, collaboration, and continuous learning.

Posted 1 week ago

Apply

6.0 - 10.0 years

9 - 13 Lacs

Kolkata

Work from Office

The Azure Databricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Databricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Databricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency.

Responsibilities:
- Design and implement scalable data pipelines using Azure Databricks.
- Develop ETL processes to efficiently extract, transform, and load data.
- Collaborate with data scientists and analysts to define and refine data requirements.
- Optimize Spark jobs for performance and efficiency.
- Monitor and troubleshoot production workflows and jobs.
- Implement data quality checks and validation processes.
- Create and maintain technical documentation related to data architecture.
- Conduct code reviews to ensure best practices are followed.
- Integrate data from various sources including databases, APIs, and third-party services.
- Utilize SQL and Python for data manipulation and analysis.
- Collaborate with DevOps teams to deploy and maintain data solutions.
- Stay updated with the latest trends in Azure Databricks and related technologies.
- Facilitate data visualization initiatives for better data-driven insights.
- Provide training and support to team members on data tools and practices.
- Participate in cross-functional projects to enhance data sharing and access.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 6 years of experience in data engineering or a related domain.
- Strong expertise in Azure Databricks and data lake concepts.
- Proficiency with SQL, Python, and Spark.
- Solid understanding of data warehousing concepts.
- Experience with ETL tools and frameworks.
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud.
- Excellent problem-solving and analytical skills.
- Ability to work collaboratively in a diverse team environment.
- Experience with data visualization tools such as Power BI or Tableau.
- Strong communication skills with the ability to convey technical concepts to non-technical stakeholders.
- Knowledge of data governance and data quality best practices.
- Hands-on experience with big data technologies and frameworks.
- A relevant certification in Azure is a plus.
- Ability to adapt to changing technologies and evolving business requirements.
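To make the Spark-side responsibilities concrete, a minimal PySpark sketch of the filter-aggregate-write pattern such pipelines follow; the table and column names are invented for the example, and on Databricks the `spark` session already exists.

```python
# Minimal sketch: a PySpark batch aggregation of the kind described above.
# Table/column names are hypothetical; the builder line is only needed
# outside Databricks, where a SparkSession is provided automatically.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

orders = spark.read.table("raw.orders")  # assumed managed/Delta table

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")
```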

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure timely project delivery.
- Provide guidance and support to team members.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes.
- Knowledge of cloud data platforms.
- Hands-on experience in SQL development.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Kolkata office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Engineer to build and maintain data pipelines for our analytics platform. Perfect for engineers focused on data processing and scalability.

Key Responsibilities:
- Design and implement ETL processes
- Manage data warehouses and ensure data quality
- Collaborate with data scientists to provide necessary data
- Optimize data workflows for performance

Required Skills & Qualifications:
- Proficiency in SQL and Python
- Experience with data pipeline tools like Apache Airflow
- Familiarity with big data technologies (Spark, Hadoop)
- Bonus: Knowledge of cloud data services (AWS Redshift, Google BigQuery)

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
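For readers new to Apache Airflow, which the listing names, a minimal sketch of a three-step ETL DAG; the dag_id, schedule, and stub functions are placeholders, and the API shown assumes Airflow 2.x.

```python
# Minimal sketch: an extract -> transform -> load DAG in Apache Airflow 2.x.
# Task bodies are stubs; ids and the schedule are placeholder values.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # stub: pull rows from a source system
    print("extracting")

def transform():  # stub: clean and shape the extracted data
    print("transforming")

def load():       # stub: write results to the warehouse
    print("loading")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain
```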

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Automation Engineer - Databricks
Job Type: Full-time, Contractor
Location: Hybrid - Hyderabad | Pune | Delhi

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We are seeking a detail-oriented and innovative Automation Engineer (Databricks) to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you.

Key Responsibilities:
- Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments.
- Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes.
- Create detailed and effective test plans and test cases based on technical requirements and business specifications.
- Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes.
- Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage.
- Document test cases, results, and identified defects; communicate findings clearly to the team.
- Conduct performance testing to ensure data processing and retrieval meet established benchmarks.
- Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation.

Required Skills and Qualifications:
- Strong proficiency in Python, Selenium, and SQL for developing test automation solutions.
- Hands-on experience with Databricks, data warehouse, and data lake architectures.
- Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt Test, or similar.
- Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred).
- Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences.
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Demonstrated problem-solving skills and a collaborative approach to teamwork.

Preferred Qualifications:
- Experience with implementing security and data protection measures in data-driven applications.
- Ability to integrate user-facing elements with server-side logic for seamless data experiences.
- Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Data Quality & Automation Engineer
Job Type: Full-time, Contractor

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We are seeking a skilled and innovative Data Quality & Automation Engineer to join our customer's dynamic team. In this role, you will leverage your expertise to ensure the quality and reliability of our data processing systems, playing a crucial role in our commitment to excellence. We are looking for a candidate who possesses a keen eye for detail and a strong ability to communicate both verbally and in writing.

Key Responsibilities:
- Develop and execute automated test scripts using Python and Selenium to validate data processing systems.
- Perform rigorous data validation and ensure data integrity across various data platforms.
- Collaborate with data engineers and developers to identify and troubleshoot issues.
- Maintain and enhance existing automation frameworks and scripts.
- Utilize SQL for advanced data querying and validation tasks.
- Implement and manage workflows using Apache Airflow.
- Work with Databricks to test data pipelines and transformations.

Required Skills and Qualifications:
- Proven experience in automation testing with a focus on data quality.
- Proficiency in Python programming and Selenium automation tools.
- Strong understanding of SQL for data validation and reporting.
- Experience with ALM.
- Knowledge of data warehousing and data lake architectures.
- Experience in leading and mentoring teams.
- Experience with data testing tools (dbt Test).
- Experience with Apache Airflow for workflow management.
- Familiarity with Databricks for data processing and analytics.
- Exceptional written and verbal communication skills.
- Attention to detail and a proactive approach to problem-solving.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure) and big data technologies.
- Knowledge of continuous integration and deployment processes.
- Certification in data engineering or software testing is a plus.
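To give a flavour of the automated data-quality checks both of these micro1 roles describe, a minimal pytest-style sketch that asserts a not-null rule over SQL; the connection string and the `orders` table are hypothetical, and SQLAlchemy stands in for whatever client the actual stack uses.

```python
# Minimal sketch: a data-quality check written as a pytest test.
# DSN and table are placeholders; requires sqlalchemy and a DB driver.
import sqlalchemy as sa

engine = sa.create_engine("postgresql://user:pass@host/warehouse")  # assumed

def test_orders_have_no_null_ids():
    with engine.connect() as conn:
        nulls = conn.execute(
            sa.text("SELECT count(*) FROM orders WHERE order_id IS NULL")
        ).scalar_one()
    assert nulls == 0, f"found {nulls} orders with NULL order_id"
```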

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Our Company: We're Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We're crucial to the company's strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market.

Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole.

Imagine the sheer breadth of talent it takes to unleash a digital future. We don't expect you to 'fit' every requirement - your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

Job Summary: We are seeking a highly skilled and analytical Senior Business Intelligence (BI) Analyst to join our team. The ideal candidate will have extensive experience in data analysis, reporting, and BI tools. This role will be crucial in transforming data into actionable insights to support business decision-making and strategy.

Key Responsibilities:
- Design, develop, and maintain BI solutions, including dashboards and reports.
- Analyze complex data sets to identify trends, patterns, and insights.
- Collaborate with business stakeholders to understand their data needs and provide analytical support.
- Ensure data accuracy and integrity by performing data validation and quality checks.
- Develop and implement data models and data visualization techniques.
- Provide training and support to end-users on BI tools and reports.
- Stay updated with the latest BI technologies and best practices.
- Lead and mentor junior BI analysts within the team.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Business Analytics, or a related field.
- Extensive experience in business intelligence, data analysis, and reporting.
- Proficiency in BI tools such as Tableau, Power BI, or QlikView.
- Strong SQL skills and experience with data warehousing concepts.
- Excellent analytical and problem-solving skills.
- Strong communication and presentation skills.
- Ability to work collaboratively with cross-functional teams.

Preferred Skills:
- Experience with cloud-based data platforms such as AWS, GCP, or Azure.
- Knowledge of programming languages such as Python or R.
- Familiarity with machine learning and predictive analytics.
- Experience with Tableau, Power BI, and Looker is a plus.
- Certification in BI tools or data analytics.

About Us: We're a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We're curious, passionate and empowered, blending our legacy of 110 years of innovation with our shaping of the future. Here you're not just another employee; you're part of a tradition of excellence and a community working towards creating a digital future.

Championing diversity, equity, and inclusion: Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You: We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We're also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We're always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you'll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with.

We're proud to say we're an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 2 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Cloud Data Scientist to build and scale data science solutions in cloud-native environments. Ideal for candidates who specialize in analytics and machine learning using cloud ecosystems.

Key Responsibilities:
- Design predictive and prescriptive models using cloud ML tools
- Use BigQuery, SageMaker, or Azure ML Studio for scalable experimentation
- Collaborate on data sourcing, transformation, and governance in the cloud
- Visualize insights and present findings to stakeholders

Required Skills & Qualifications:
- Strong Python/R skills and experience with cloud ML stacks (AWS, GCP, or Azure)
- Familiarity with cloud-native data warehousing and storage (Redshift, BigQuery, Data Lake)
- Hands-on with model deployment, CI/CD, and A/B testing in the cloud
- Bonus: Background in NLP, time series, or geospatial analysis

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are looking to hire Snowflake professionals in the following areas.

Skills: Snowflake, SnowSQL, PL/SQL, any ETL tool

- 3+ years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing; creating and managing automated data pipelines for both batch and streaming data using DBT.
- Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake.
- Writing and optimizing SQL queries for efficient data retrieval and analysis.
- Deliver robust solutions through query optimization, ensuring data quality.
- Experience in writing functions and stored procedures.
- Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling.
- Analyse and translate functional specifications and user stories into technical specifications.
- Good to have: design/development experience in any ETL tool.
- Good interpersonal skills; experience in handling communication and interactions between different teams.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
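As a small illustration of the Snowflake SQL work listed above, a minimal sketch using the official snowflake-connector-python package to query a star schema; the account, credentials, and fact/dimension table names are all placeholders.

```python
# Minimal sketch: query a hypothetical star schema in Snowflake from Python.
# Every connection value and table name below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="MART",
)
cur = conn.cursor()
try:
    cur.execute("""
        SELECT d.calendar_month, SUM(f.amount) AS revenue
        FROM fact_sales f
        JOIN dim_date d ON f.date_key = d.date_key
        GROUP BY d.calendar_month
        ORDER BY d.calendar_month
    """)
    for month, revenue in cur.fetchall():
        print(month, revenue)
finally:
    cur.close()
    conn.close()
```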

Posted 2 weeks ago

Apply

5.0 - 10.0 years

2 - 3 Lacs

Kolkata, Ramgarh

Work from Office

Sodexo Food Solutions India Pvt. Ltd. is looking for an HR cum MIS Executive to join our dynamic team and embark on a rewarding career journey.
- Sound knowledge of and hands-on experience with HLOOKUP, VLOOKUP, pivot tables, conditional formatting, etc.
- Good at preparing MIS reports.
- Perform data analysis for generating reports on a periodic basis.
- Provide strong reporting and analytical information support.
- Knowledge of various MIS reporting tools.

Posted 2 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Engineering Manager to lead a team building data pipelines, models, and analytics infrastructure. Ideal for experienced engineers who can manage both technical delivery and team growth.

Key Responsibilities:
- Lead development of ETL/ELT pipelines and data platforms
- Manage data engineers and collaborate with analytics/data science teams
- Architect systems for data ingestion, quality, and warehousing
- Define best practices for data architecture, testing, and monitoring

Required Skills & Qualifications:
- Strong experience with big data tools (Spark, Kafka, Airflow)
- Proficiency in SQL, Python, and cloud data services (e.g., Redshift, BigQuery)
- Proven leadership and team management in data engineering contexts
- Bonus: Experience with real-time streaming and ML pipeline integration

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 2 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Kolkata

Work from Office

Your Role:
- Excellent in Tableau schemas, extracts, dashboard design, implementation, maintenance, and dashboard development.
- Good knowledge of SQL and database concepts.
- Experience with all components of the Tableau suite, including but not limited to Tableau Desktop, Tableau Prep, and Tableau architecture.

Your Profile:
- Design and develop solutions using Tableau dashboards (web and mobile), with good knowledge of SQL and database concepts.
- Experience with all components of the Tableau suite, including but not limited to Tableau Desktop, Tableau Prep, and Tableau architecture.
- Must have strong experience in Tableau development across reports, dashboards, and documents.

What you'll love about working here: Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesn't look as bright as you'd like, you have the opportunity to make change: to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger. A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.

About Capgemini

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Kolkata, Bengaluru

Hybrid

Required Experience:
- Design, develop, and maintain ETL/ELT workflows using Informatica IICS.
- Collaborate with business and technical teams to understand requirements and translate them into robust data integration solutions.
- Optimize data pipelines for performance and scalability.
- Integrate IICS solutions with cloud-based data stores like Google BigQuery and cloud storage solutions.
- Develop data mappings, task flows, parameter files, and reusable objects.
- Manage deployments, migrations, and version control for IICS assets.
- Perform unit testing, debugging, and troubleshooting of ETL jobs.
- Document data flow and architecture as part of the SDLC.
- Work in an Agile environment and participate in sprint planning, reviews, and retrospectives.
- Provide mentorship and code reviews for junior developers, ensuring adherence to best practices and coding standards.

Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 7+ years of experience in ETL development, with at least 2-3 years in Informatica IICS.
- Strong experience in data integration, transformation, and orchestration using IICS.
- Good working knowledge of cloud data platforms, preferably Google Cloud Platform (GCP).
- Hands-on experience with Google BigQuery (GBQ), including writing SQL queries, data ingestion, and optimization.
- Strong SQL skills and experience with RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
- Experience in integrating data from various sources including on-prem, SaaS applications, and cloud data lakes.
- Familiarity with data governance, data quality, and data cataloging tools.
- Understanding of REST APIs and experience with API integration in IICS.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work effectively in a team.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Kolkata, Siliguri, Asansol

Work from Office

- 10+ years of strong experience with data transformation and ETL on large data sets.
- Experience with designing customer-centric datasets (e.g., CRM, call center, marketing, offline, point of sale, etc.).
- 5+ years of data modeling experience (e.g., relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced data warehouse concepts.
- Experience in industry ETL tools (e.g., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with the sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

8 - 14 Lacs

Kolkata

Work from Office

Job Title: Data Engineer / Data Modeler
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

8 - 14 Lacs

Kolkata

Work from Office

Job Description: We are looking for a skilled Data Engineer with strong hands-on experience in Clickhouse, Kubernetes, SQL, Python, and FastAPI, along with a good understanding of PostgreSQL. The ideal candidate will be responsible for building and maintaining efficient data pipelines, optimizing query performance, and developing APIs to support scalable data services.

Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines and ETL processes.
- Develop and optimize Clickhouse databases for high-performance analytics.
- Create RESTful APIs using FastAPI to expose data services.
- Work with Kubernetes for container orchestration and deployment of data services.
- Write complex SQL queries to extract, transform, and analyze data from PostgreSQL and Clickhouse.
- Collaborate with data scientists, analysts, and backend teams to support data needs and ensure data quality.
- Monitor, troubleshoot, and improve performance of data infrastructure.

Requirements:
- Strong experience in Clickhouse: data modeling, query optimization, performance tuning.
- Expertise in SQL, including complex joins, window functions, and optimization.
- Proficient in Python, especially for data processing (Pandas, NumPy) and scripting.
- Experience with FastAPI for creating lightweight APIs and microservices.
- Hands-on experience with PostgreSQL: schema design, indexing, and performance.
- Solid knowledge of Kubernetes: managing containers, deployments, and scaling.
- Understanding of software engineering best practices (CI/CD, version control, testing).
- Experience with cloud platforms like AWS, GCP, or Azure.
- Knowledge of data warehousing and distributed data systems.
- Familiarity with Docker, Helm, and monitoring tools like Prometheus/Grafana.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Kolkata

Work from Office

Job Summary: We are seeking an experienced Data Engineer with strong expertise in Databricks, Python, PySpark, and Power BI, along with a solid background in data integration and the modern Azure ecosystem. The ideal candidate will play a critical role in designing, developing, and implementing scalable data engineering solutions and pipelines.

Key Responsibilities:
- Design, develop, and implement robust data solutions using Azure Data Factory, Databricks, and related data engineering tools.
- Build and maintain scalable ETL/ELT pipelines with a focus on performance and reliability.
- Write efficient and reusable code using Python and PySpark.
- Perform data cleansing, transformation, and migration across various platforms.
- Work hands-on with Azure Data Factory (ADF), with at least 1.5 to 2 years of experience.
- Develop and optimize SQL queries and stored procedures, and manage large data sets using SQL Server, T-SQL, PL/SQL, etc.
- Collaborate with cross-functional teams to understand business requirements and provide data-driven solutions.
- Engage directly with clients and business stakeholders to gather requirements, suggest optimal solutions, and ensure successful delivery.
- Work with Power BI for basic reporting and data visualization tasks.
- Apply strong knowledge of data warehousing concepts, modern data platforms, and cloud-based analytics.
- Adhere to coding standards and best practices, including thorough documentation and testing (unit, integration, performance).
- Support the operations, maintenance, and enhancement of existing data pipelines and architecture.
- Estimate tasks and plan release cycles effectively.

Required Technical Skills:
- Languages & Frameworks: Python, PySpark
- Cloud & Tools: Azure Data Factory, Databricks, Azure ecosystem
- Databases: SQL Server, T-SQL, PL/SQL
- Reporting & BI Tools: Power BI
- Data Concepts: Data Warehousing, ETL/ELT, Data Cleansing, Data Migration
- Other: Version control, Agile methodologies, good problem-solving skills

Preferred Qualifications:
- Experience with coding in Pysense within Databricks (added advantage)
- Solid understanding of cloud data architecture and analytics processes
- Ability to independently initiate and lead conversations with business stakeholders

Posted 3 weeks ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Kolkata, Mumbai (All Areas)

Hybrid

You will be part of the global Marketing & Creative Services team and will be responsible for working with marketing teams (onshore and offshore) to:
- Build enterprise data and analytics solutions generating meaningful KPI dashboards for Capgemini Group.
- Drive BI and data analytics solutions for measuring strategies aligned to marketing and business objectives.
- Understand the overall digital landscape of the organization as well as that of key competitors.

Key Responsibilities:
- Participate in marketing/campaign briefing, solution architecture building, and design activities.
- Manage the implementation of tracking tags using Google Tag Manager to measure web and marketing campaign performance.
- Develop integrated corporate visualization solutions derived from multiple data sources using state-of-the-art tools to enable insight and decision-making at various levels.
- Develop data models, including performance measures, to enable scorecard analysis.
- Perform ad-hoc analysis and present results in a clear manner.
- Learn and establish data visualization standards and explore new, cutting-edge tools to periodically raise those standards.
- Ensure accuracy, completeness, and timeliness of reporting through proactive awareness, alerts, testing, and processes.
- Perform exploratory data analysis and data pre-processing to arrive at patterns and trends.

Communication:
- Work closely with the platforms, digital, and marketing teams.
- Collaborate with wider business stakeholders from the Group.
- Good presentation skills for reports and dashboards.

Specification / Skills / Experience:
- Must have a high level of proficiency in data visualization tools: Power BI and/or Looker Studio and/or Tableau, DOMO.
- Hands-on experience in implementing and managing Google Tag solutions using GTM or other tag management tools.
- Experience with JavaScript, HTML, and CSS for custom tag implementations is an add-on.
- Strong understanding of digital marketing KPIs and tracking methods.
- Capability to create a data story with the help of dashboards within defined timelines.
- Strong knowledge of data warehouse concepts, including data modelling, data quality, ETL, reporting, analytics, and visualization.
- Good grasp of Amazon AWS/Redshift/Azure/GCP or other data warehouse providers.
- Knowledge of data extraction and data cleansing methodologies.
- Must have proficiency in SQL to write analytical queries, create filters, and build calculated data sets.
- Strong critical thinking, analytical, organizational, interpersonal, and verbal and written communication skills.
- Highly motivated and able to work independently as well as in a team environment.
- Ability to operate comfortably and effectively in a fast-paced, highly cross-functional, rapidly changing environment.
- Excellent analytical skills with a demonstrated ability to solve problems.
- An Agile mindset and a good understanding of MVP / iterative development are desired.

Posted 3 weeks ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Kolkata

Work from Office

Key Responsibilities: As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- 10+ years of strong experience with data transformation and ETL on large data sets.
- Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of Data Modeling experience (i.e., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced Data Warehouse concepts.
- Proven experience with industry ETL tools (i.e., Informatica, Unifi).
- Solid experience with Business Requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with Reporting Technologies (i.e., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, PowerBI

Posted 3 weeks ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Build Tool (dbt)
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and organizational standards, facilitating smooth data integration and accessibility across different systems. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and architecture.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Build Tool (dbt).
- Strong understanding of data modeling techniques and methodologies.
- Experience with data integration and ETL processes.
- Familiarity with database management systems and SQL.
- Ability to translate business requirements into technical specifications.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Build Tool (dbt).
- This position is based at our Kolkata office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Kolkata

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: SAP FI CO Finance
Good-to-have skills: NA
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of enhancements and maintenance tasks, while also contributing to the development of new features that meet client needs. You will be responsible for troubleshooting issues and providing solutions, ensuring that the application remains robust and efficient.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct thorough testing and debugging of application components to ensure high-quality deliverables.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP FI CO Finance.
- Strong understanding of financial processes and accounting principles.
- Experience with integration of SAP modules and data migration.
- Familiarity with SAP reporting tools and financial analysis.
- Ability to troubleshoot and resolve issues within SAP environments.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP FI CO Finance.
- This position is based at our Kolkata office.
- A 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies