
126 Data Warehousing Jobs in Kolkata - Page 5

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 - 8.0 years

7 - 13 Lacs

Kolkata

Work from Office

Job Summary: We are seeking a skilled and motivated Data Engineer with 5-8 years of experience to join our growing data team. The ideal candidate will be responsible for designing, developing, testing, deploying, and maintaining robust, scalable, and efficient data pipelines and infrastructure. You will work closely with data scientists, analysts, software engineers, and business stakeholders to understand data requirements and deliver high-quality data solutions that drive business insights and decisions.

Key Responsibilities:
- Design, build, and maintain scalable and reliable ETL/ELT data pipelines to ingest, transform, and load data from diverse sources (e.g., relational databases, APIs, streaming platforms, flat files).
- Develop and manage data warehousing solutions, ensuring data integrity, optimal performance, and cost-effectiveness.
- Implement data models, data schemas, and data dictionaries to support business and analytical requirements.
- Ensure data quality, consistency, and accuracy across all data systems by implementing data validation, cleansing, and monitoring processes.
- Optimize data pipeline performance and troubleshoot data-related issues.
- Collaborate with data scientists and analysts to provide clean, well-structured, and readily accessible data for their analysis and modelling needs.
- Implement and maintain data security and governance best practices.
- Automate data processes and workflows using scripting and orchestration tools.
- Document data pipelines, architectures, and processes.
- Stay up to date with emerging data technologies and best practices, and recommend improvements to the existing data stack.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field.
- 5-8 years of hands-on experience in a Data Engineering role.
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Proficiency in Python.
- Experience building and optimizing data pipelines using ETL/ELT tools and frameworks (e.g., Apache Airflow, dbt, Informatica, Talend, custom scripts).
- Hands-on experience with big data technologies (e.g., Apache Spark; the Hadoop ecosystem: HDFS, MapReduce, Hive).
- Experience with cloud platforms (e.g., Azure: ADLS, Databricks, Synapse; GCP: GCS, BigQuery, Dataflow).
- Understanding of data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery, Synapse Analytics).
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Experience with version control systems (e.g., Git).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
- Ability to manage multiple tasks and projects simultaneously.

Preferred/Bonus Skills:
- Experience with real-time data streaming technologies (e.g., Apache Kafka, Kinesis, Flink, Spark Streaming).
- Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
- Familiarity with CI/CD pipelines for data engineering.
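The orchestration and pipeline skills this listing asks for are easy to picture with a small example. Below is a minimal sketch of an Apache Airflow DAG wiring an extract-transform-load sequence; the DAG ID, task names, paths, and schedule are hypothetical placeholders, not part of the listing.

```python
# Minimal Airflow DAG sketch: a daily ELT flow with extract -> transform -> load.
# All task names and the DAG ID are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Pull raw records from a source system (API, database, flat file).
    ...

def transform_orders(**context):
    # Cleanse and reshape the extracted data.
    ...

def load_orders(**context):
    # Write the transformed data into the warehouse.
    ...

with DAG(
    dag_id="orders_elt",                # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)

    extract >> transform >> load  # dependencies define the pipeline order
```

In a real pipeline each callable would be replaced by concrete extract/transform/load logic, or by provider operators for the specific sources and warehouse in use.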

Posted 1 month ago

Apply

6 - 11 years

4 - 8 Lacs

Kolkata

Work from Office

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities, where you can make a difference and where no two days are the same.

Your Role
- Excellent in Tableau schema, extracts, dashboard design, implementation, maintenance, and dashboard development.
- Good knowledge of SQL and database concepts.
- Experience with all components of the Tableau suite, including but not limited to Tableau Desktop, Tableau Prep, and Tableau architecture.

Your Profile
- Design and develop solutions using Tableau dashboards (web and mobile), with good knowledge of SQL and database concepts.
- Experience with all components of the Tableau suite, including but not limited to Tableau Desktop, Tableau Prep, and Tableau architecture.
- Strong experience in Tableau development across reports, dashboards, and documents.

What you'll love about working here
Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means that when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job; you become part of something bigger: a diverse collective of free-thinkers, entrepreneurs, and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Kolkata

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or in a similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.
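For context on what one such migration step can look like in practice, here is a hedged PySpark sketch that reads an Oracle table over JDBC and writes it to BigQuery. The JDBC URL, credentials, table names, and staging bucket are invented placeholders; the write assumes the spark-bigquery connector and an Oracle JDBC driver are available on the cluster (as they typically are on Dataproc).

```python
# Sketch of one migration step: Oracle table -> BigQuery table via PySpark.
# All connection details and names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_to_bq").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB")  # placeholder
    .option("dbtable", "SALES.ORDERS")                         # placeholder
    .option("user", "etl_user")
    .option("password", "***")
    .option("fetchsize", 10000)   # larger fetches cut round trips to Oracle
    .load()
)

(
    orders.write.format("bigquery")
    .option("table", "analytics_ds.orders")      # placeholder dataset.table
    .option("temporaryGcsBucket", "my-staging")  # placeholder staging bucket
    .mode("overwrite")
    .save()
)
```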

Posted 1 month ago

Apply

3 - 6 years

14 - 19 Lacs

Kolkata

Work from Office

We are looking for a skilled Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who can design, develop, and maintain data models, integrations, and workflows within Palantir Foundry. The ideal candidate should have strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills.

### Roles and Responsibility
- Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
- Analyze data within Palantir to extract insights for easy interpretation and exploratory data analysis.
- Utilize programming languages and scripts to interact with the data and perform analyses.
- Optimize data storage and retrieval based on OLAP engine principles.
- Integrate Palantir with other systems and applications using APIs for seamless data flow.
- Collaborate with stakeholders to identify opportunities for continuous improvement in data processes and solutions.

### Job Requirements
- Strong experience with Data Warehousing, Data Engineering, and Data Modelling problem statements.
- Hands-on knowledge of Palantir solutions such as Usecare, DTI, Code Repository, Pipeline Builder, etc.
- Experience with distributed frameworks and automation using Spark APIs.
- Knowledge of security-related principles to ensure data privacy and security while working with sensitive information.
- Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
- Self-driven learning of technologies as required by the organization.
- Ability to work independently and consistently meet deadlines.
- Detail-oriented team member who can multitask and work with a diverse group of stakeholders in the healthcare/Life Science/Pharmaceutical domains.
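As a rough illustration of Foundry pipeline work, the sketch below shows a Python transform in the style of Foundry's transforms-python SDK. The dataset paths and column names are hypothetical, and the exact SDK surface may vary by Foundry version, so treat this as a sketch rather than a definitive implementation.

```python
# Sketch of a Foundry Python transform: read a raw dataset, apply a PySpark
# cleanup, write a derived dataset. Both dataset paths are placeholders.
from pyspark.sql import functions as F
from transforms.api import Input, Output, transform_df


@transform_df(
    Output("/Project/datasets/claims_clean"),   # placeholder output path
    raw=Input("/Project/datasets/claims_raw"),  # placeholder input path
)
def clean_claims(raw):
    # Deduplicate, normalize types, and drop records without a status.
    return (
        raw.dropDuplicates(["claim_id"])
        .withColumn("claim_amount", F.col("claim_amount").cast("double"))
        .filter(F.col("claim_status").isNotNull())
    )
```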

Posted 1 month ago

Apply

8 - 11 years

15 - 20 Lacs

Kolkata

Work from Office

We are looking for a highly motivated and experienced Data Modeller – Manager Level specializing in Life Insurance accounts, with 8-11 years of industry experience. The ideal candidate will lead data modelling efforts across various projects, driving the design, development, and governance of data architectures to support business intelligence, analytics, and operational data requirements.

### Roles and Responsibility
- Lead the development and governance of conceptual, logical, and physical data models to support data strategies for life insurance accounts.
- Oversee the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on SQL/NoSQL platforms, ensuring scalability and efficiency.
- Optimize data architectures and query performance by applying industry best practices and advanced modelling techniques.
- Collaborate with business stakeholders and IT teams to transform business requirements into comprehensive data strategies and architecture solutions.
- Establish and enforce data modelling standards and design practices that adhere to enterprise data governance policies.
- Define data flows and create data mappings to ensure seamless integration, enhanced data quality, and efficient reporting.
- Identify and recommend solutions for data infrastructure, security, automation, and interface design to support robust data ecosystems.
- Provide guidance and mentorship to teams, promoting best practices and maintaining consistency in data modelling across projects.
- Perform hands-on data modelling, design, performance tuning, and proof-of-concept (POC) development to validate models and solutions.
- Manage project timelines, mitigate risks, and address challenges to ensure on-time and successful delivery of data initiatives.

### Job Requirements
- Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
- Proven experience in data modelling within the insurance domain (Life Insurance preferred).
- Strong expertise in designing and managing relational, dimensional, and NoSQL data models.
- Proficiency in data modelling tools and platforms (e.g., ER/Studio, ERwin, or similar).
- Deep understanding of data architecture principles, ETL processes, and data warehousing concepts.
- Knowledge of cloud platforms and technologies (Azure, AWS, or GCP) is a plus.
- Ability to manage multiple projects and collaborate with cross-functional teams.
- Excellent problem-solving, communication, and leadership skills.
- Good exposure to any ETL tool.
- Good knowledge of Life Insurance.
- Understanding of Business Intelligence, Data Warehousing, and Data Modelling.
- Must have led a team of at least 4 members.
- Experience in the Insurance domain.
- Prior client-facing skills; self-motivated and collaborative.

Posted 1 month ago

Apply

3 - 5 years

7 - 15 Lacs

Kolkata

Work from Office

Sundew is a growing company with 18 years of expertise in Digital Transformation, excelling in Digital Strategy, Cloud Native Application Development, AI, and Product Engineering. We are seeking an experienced SAP Analytics Consultant with a proven track record in SAP Datasphere implementations to join our growing team of data professionals. The ideal candidate will have at least one full-lifecycle implementation of SAP Datasphere (formerly SAP Data Warehouse Cloud); preferential consideration will be given to candidates with Databricks knowledge or experience.

Responsibilities
- Design, develop, and implement SAP Datasphere solutions.
- Create and maintain semantic models, views, data flows, and task chains in SAP Datasphere.
- Configure source-system connections for both SAP and non-SAP data sources.
- Implement data replication, federation, and real-time data integration strategies.
- Develop comprehensive data models and analytics solutions in SAP Analytics Cloud (SAC).
- Collaborate with technical and business stakeholders to gather requirements and translate them into technical solutions.
- Develop and maintain documentation for implemented solutions.
- Stay current with SAP Datasphere and SAP Business Data Cloud features and best practices.

Required Qualifications
- At least one full-lifecycle implementation of SAP Datasphere (formerly SAP Data Warehouse Cloud).
- 3-5 years of experience in SAP Analytics, Business Intelligence, or Data Warehousing.
- Strong understanding of SAP HANA modelling techniques and SAP Analytics Cloud.
- Experience with data integration tools such as Smart Data Integration (SDI), SAP Data Intelligence (DI), or SAP Landscape Transformation (SLT).
- Proficiency in creating and maintaining Datasphere components: tables, views, Intelligent Lookup, data flows, task chains, etc.
- Hands-on experience with SQL and data modelling.
- Experience with additional SAP tools: BW/4HANA, S/4HANA Analytics, CDS Views.
- Experience in accessing and modelling data from S/4HANA systems.
- Strong analytical and problem-solving skills.
- Excellent communication skills and the ability to explain complex technical concepts to non-technical stakeholders.
- SAP certifications in relevant technologies.

Preferred Qualifications
- Experience with Databricks for data analytics, ML, and AI applications.
- Experience with Python, PySpark, or other programming languages.
- Exposure to data lake and lakehouse architectures.
- Experience with AWS, Azure, or Google Cloud platforms.
- Experience with non-SAP visualization tools such as Power BI.

What We Offer
- Join a rapidly expanding, global organization leading the charge in digital transformation with cutting-edge technologies.
- Remote work environment to support work-life balance.
- Opportunity to grow in a fast-paced and emerging domain.
- Inclusive and collaborative team culture where your ideas matter.
- Competitive compensation package with performance-based incentives to reward excellence and drive measurable results.

Posted 1 month ago

Apply

4 - 6 years

16 - 18 Lacs

Kolkata

Work from Office

- SQL queries and T-SQL (functions and packages)
- SQL optimization and performance improvement
- Knowledge of data warehousing concepts (star schema, fact and dimension tables)
- Experience in SQL Server and SSIS
- SSIS development experience on data loads
- Overall experience with SQL Server
- SSIS package configuration and optimization
- ETL transformations, error handling, data flow components, script components, debugging
- Input files: XML, CSV, flat files, JSON
- Installation, backups, database settings, configurations
- Strong SQL skills (joins, subqueries, aggregations, window functions)
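To make the listed SQL skills concrete, here is a small, runnable illustration using Python's standard-library sqlite3 (the job itself targets SQL Server, where the window-function syntax is essentially the same): a toy star schema with one fact and one dimension table, queried with a join, an aggregation, and a window function. The table and column names are invented for the demo.

```python
# Runnable demo: star-schema join + window function.
# Window functions require SQLite 3.25+ (Python 3.7+ ships a new-enough build).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 1, 10.0), (2, 1, 25.0), (3, 2, 40.0);
""")

# Each sale's share of its category's total revenue, via a window function.
rows = con.execute("""
    SELECT d.category,
           f.amount,
           f.amount / SUM(f.amount) OVER (PARTITION BY d.category) AS share
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    ORDER BY d.category, f.amount
""").fetchall()

for category, amount, share in rows:
    print(f"{category}: {amount:.2f} ({share:.0%} of category)")
```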

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Kolkata

Work from Office

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and the reuse of proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post-delivery
- Number of non-compliance issues
- Reduction in recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, and Azure ADF and ADLS.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Required Skills & Qualifications:
- A degree (preferably an advanced degree) in Computer Science, Engineering, or a related field.
- Senior developer with 8+ years of hands-on development experience in Azure using ASB and ADF: extensive experience in designing, developing, and maintaining data solutions/pipelines in the Azure ecosystem, including Azure Service Bus and ADF.
- Familiarity with MongoDB and Python is an added advantage.
Required Skills: Azure Data Factory, Azure Service Bus, Azure, MongoDB
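The data-quality and validation duties described above can be sketched as a simple pre-load gate. The following is an illustrative example only; the file path, column names, and specific checks are assumptions, not part of the role description.

```python
# Minimal data-validation sketch: completeness, uniqueness, and range checks
# on an incoming batch before the load step runs. Names are placeholders.
import pandas as pd


def validate_batch(path: str) -> list[str]:
    """Return a list of human-readable data-quality failures (empty = pass)."""
    df = pd.read_csv(path)
    failures = []

    if df.empty:
        failures.append("batch is empty")
    if df["customer_id"].isna().any():     # completeness check
        failures.append("null customer_id values found")
    if df["order_id"].duplicated().any():  # uniqueness check
        failures.append("duplicate order_id values found")
    if (df["amount"] < 0).any():           # range/sanity check
        failures.append("negative amounts found")

    return failures


# Example gate in a pipeline: fail fast before loading.
# problems = validate_batch("incoming/orders.csv")
# if problems:
#     raise ValueError("; ".join(problems))
```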

Posted 1 month ago

Apply

6 - 11 years

22 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Skill Combination: Snowflake + (Python or DBT) + (AWS or Azure) + SQL + Data Warehousing
Location: Kolkata

Experience & CTC:
- Band 4B: 4 to 7 years, up to 21 LPA (fixed)
- Band 4C: 7 to 11 years, up to 28 LPA (fixed)
- Band 4D: 10 to 16 years, up to 35 LPA (fixed)

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python/DBT + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and the integration of Snowflake with other data processing tools.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD type-2.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, Computer Engineering, or an equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, DBT, AWS/Azure, ETL concepts, and Data Warehousing concepts
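Since the role calls out SCD type-2 implementation in Snowflake, here is a hedged sketch of one common pattern driven from Python: expire the current dimension rows whose attributes changed, then insert fresh current versions. Connection parameters, table names, and columns are all placeholders.

```python
# Hedged SCD type-2 sketch against Snowflake via the official connector.
# All identifiers below (tables, columns, credentials) are placeholders.
import snowflake.connector

con = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="DW",
)

EXPIRE_SQL = """
MERGE INTO dim_customer AS tgt
USING stg_customer AS src
  ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address <> src.address THEN UPDATE SET
  is_current = FALSE,
  valid_to   = CURRENT_TIMESTAMP()
"""

INSERT_SQL = """
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL  -- no current row: new customer or just expired
"""

with con.cursor() as cur:
    cur.execute(EXPIRE_SQL)  # step 1: close out changed "current" rows
    cur.execute(INSERT_SQL)  # step 2: insert fresh current versions
con.close()
```

A production pipeline would typically wrap both statements in a transaction and compare a hash of all tracked attributes rather than a single column.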

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Kolkata

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years or more of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure the quality of deliverables by conducting thorough testing and debugging of applications.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Good-to-have skills: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
- Strong understanding of SQL and database design principles.
- Experience with ETL tools and processes.
- Experience with programming languages such as Python or Java.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: 15 years or more of full-time education
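As a flavour of day-to-day BigQuery work, here is a short sketch using the official google-cloud-bigquery Python client. The project, dataset, table, and query are illustrative placeholders; credentials are assumed to come from Application Default Credentials in the environment.

```python
# Querying BigQuery from Python with the official client library.
# Project/dataset/table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

query = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM `my-project.sales_ds.orders`            -- placeholder table
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY total_revenue DESC
"""

for row in client.query(query).result():  # result() blocks until the job finishes
    print(row["region"], row["total_revenue"])
```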

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software programs to meet business requirements.
- Collaborate with team members to design and develop applications.
- Troubleshoot and debug applications to ensure optimal performance.
- Conduct code reviews and provide feedback to improve code quality.
- Stay updated with industry trends and technologies to enhance application development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and optimizing data pipelines.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business requirements.
- Troubleshoot and debug issues in existing applications.
- Ensure the performance, quality, and responsiveness of applications.
- Participate in code reviews to maintain code quality.
- Stay up to date with emerging technologies and industry trends.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Good-to-have skills: Experience with data integration tools.
- Strong understanding of ETL concepts and data warehousing principles.
- Experience in designing and developing ETL workflows using Ab Initio.
- Knowledge of SQL and database concepts.
- Familiarity with version control systems such as Git.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software programs to meet business requirements.
- Collaborate with team members to design and develop applications.
- Troubleshoot and debug applications to ensure optimal performance.
- Conduct code reviews and provide feedback to improve code quality.
- Stay updated on industry trends and technologies to enhance application development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and optimizing data pipelines.
- Good-to-have skills: Experience with data modeling and database design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business requirements.
- Troubleshoot and debug issues in existing applications.
- Ensure the performance, quality, and responsiveness of applications.
- Participate in code reviews to maintain code quality.
- Stay up to date with emerging technologies and industry trends.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Good-to-have skills: Experience with data integration tools.
- Strong understanding of ETL concepts and data warehousing principles.
- Experience in designing and developing ETL workflows using Ab Initio.
- Knowledge of SQL and database concepts.
- Familiarity with version control systems such as Git.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

7 - 12 years

9 - 14 Lacs

Kolkata

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines and ETL processes using Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls for the data platform.
- Troubleshoot and resolve issues related to the data platform and data pipelines.

Professional & Technical Skills:
- Must-have skills: Experience with Databricks Unified Data Analytics Platform; strong understanding of data modeling and database design principles.
- Good-to-have skills: Experience with cloud-based data platforms such as AWS or Azure; experience with data security and access controls; experience with data visualization tools such as Tableau or Power BI.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices.
- Mandatory return to office (RTO) for 2-3 days, working in two shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualifications: Engineering graduate, preferably Computer Science; 15 years of full-time education
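A typical pipeline step on such a platform might look like the following hedged PySpark/Delta sketch, a bronze-to-silver refinement. The paths and columns are invented; on Databricks the spark session is provided for you, so the builder line is only needed when running elsewhere with the delta-spark package configured.

```python
# Bronze-to-silver refinement sketch: read raw JSON, clean it, write Delta.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")  # placeholder input path

silver = (
    raw.filter(F.col("event_type").isNotNull())        # drop malformed events
    .withColumn("event_date", F.to_date("event_ts"))   # derive partition column
    .dropDuplicates(["event_id"])                      # idempotent re-runs
)

(
    silver.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("/mnt/silver/events/")  # placeholder output path
)
```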

Posted 1 month ago

Apply

7 - 12 years

9 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data management principles and practices.
- Experience in designing and implementing data integration solutions.
- Knowledge of data quality and data governance concepts.
- Experience with data modeling and database design.
- Good-to-have skills: Experience with ETL tools such as Informatica PowerCenter; experience with data migration and data synchronization; familiarity with master data management best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM.
- This position is based in Kolkata.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Kolkata, Bengaluru

Work from Office

Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and for configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating-system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate) troubleshooting.
Must have skills: IBM DB2 Database Administration
Good to have skills: No Function Specialty
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education; BTech

Summary: As an Infra Tech Support Practitioner, you will be responsible for providing ongoing technical support and maintenance of production and development systems and software products, both remote and onsite. You will work on various platforms, implementing technology at the operating-system level and performing basic and intermediate troubleshooting.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Ensure hardware/software support for configured services running on various platforms.
- Implement technology at the operating-system level across all server and network areas.
- Provide L1 and L2 (basic and intermediate) troubleshooting.
- Maintain documentation of technical issues and resolutions.
- Collaborate with cross-functional teams to resolve technical issues efficiently.

Professional & Technical Skills:
- Must-have skills: Proficiency in IBM DB2 Database Administration.
- Strong understanding of database management principles.
- Experience in performance tuning and optimization of database systems.
- Knowledge of backup and recovery procedures for databases.
- Familiarity with database security best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM DB2 Database Administration.
- This position is based at our Bengaluru office.
- A BTech degree with 15 years of full-time education is required.

Qualifications: 15 years full-time education; BTech

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Stibo Product Master Data Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must-have skills: Proficiency in Stibo Product Master Data Management.
- Strong understanding of data management principles and best practices.
- Experience in designing and implementing data management solutions.
- Knowledge of data integration and migration techniques.
- Experience with data quality assessment and improvement.
- Good-to-have skills: Experience with other Product Master Data Management tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Stibo Product Master Data Management.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

ClickUp is the world's only all-in-one productivity platform that flexes to the way people want to work. It replaces all individual workplace productivity tools with a single, unified platform that includes project management, document collaboration, whiteboards, spreadsheets, and AI. With our headquarters based in San Diego and a rapidly expanding global presence, we are shaping the future of work. Join our team at ClickUp, one of the fastest-growing SaaS companies worldwide, and help millions of users be more productive, saving them at least one day every week.

In this dynamic and rapidly evolving digital landscape, the seamless coordination and efficient management of tasks and projects have become paramount. This role will be instrumental in assisting and empowering ClickUp, a cutting-edge productivity platform, to thrive and provide top-notch solutions. While your expertise will be utilized by ClickUp, Deel will serve as your Employer of Record in India. This unique collaboration will create an environment where your talent can truly shine, making a significant impact on the productivity and success of ClickUp's operations.

At ClickUp, data is the key driver behind the decisions we make every day. Our Software Engineering team relies on accurate and up-to-date data to measure teams' success in building a high-quality and cost-effective product. An ideal candidate would have strong SQL skills, experience working with app performance data, and the ability to answer challenging product questions through the use of a data warehouse and raw data sources.

The Role:
- Partner with software engineering managers (EMs) and executives to guide engineering decisions by extracting meaningful insights from large and complex datasets.
- Turn loosely defined problems into practical analyses.
- Gain a deep understanding of our users and the factors affecting app quality and performance, interpreting and conveying findings to key stakeholders.
- Take initiative to identify organizational needs and lead the right projects, while also supporting others with your expertise.

Data Stack: AWS, Postgres, S3, Datadog, Snowflake, DBT, Hex.

Qualifications:
- 3+ years of experience as a Product Analyst or Product Data Scientist.
- A strong understanding of software business principles and SaaS metrics.
- Advanced skills in SQL transformations and relational databases.
- Experience building statistical models to understand key drivers of a metric using Python or R.
- Experience working with raw log-level data to drive actionable insights.
- Proficiency with BI tools (Tableau, Looker, QuickSight).
- Self-motivated, operationally focused, and a problem-solver.
- Excellent interpersonal, written, and oral communication skills.

Desirable:
- Experience working with Software Engineering teams on Engineering-driven initiatives.
- Experience with the ClickUp data stack.

Unsure if you meet all the qualifications of this job description but deeply excited about the role? We hire based on ambition, grit, and a passion for improving the way people work. If you think ClickUp is the company for you, we encourage you to apply!

ClickUp was founded on a culture of hard work, consistent growth, and a desire to break norms. We're a values-driven company and hire based on ambition, merit, and a willingness to do what it takes to succeed. We don't care where you're from, what you look like, or who you're in a relationship with: we hire the best people for the job and create an environment that supports employees on their journey to do the most exciting work of their lives! ClickUp is an Equal Opportunity Employer, and qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, or national origin.

ClickUp collects and processes personal data in accordance with applicable data protection laws. If you are a European job applicant, see our privacy policy for further details. If you are a Philippine job applicant, see our privacy policy and our Philippine Data Privacy Notice for further details.

Please note we are unable to sponsor or take over sponsorship of an employment visa for roles outside of engineering and product at this time. Sponsorship for engineering and product roles is not guaranteed, but is instead based on business needs for that specific role at that time. Please reach out to the recruiter with any questions.
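The "key drivers of a metric" qualification above can be illustrated with a tiny, self-contained sketch: fit a logistic regression on standardized user-level features and read the coefficients as rough driver signals. The feature names and synthetic data below are invented for the demo, not ClickUp's actual stack.

```python
# Toy driver analysis: which features move a binary outcome (e.g. retention)?
# Data and feature names are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g. sessions, tasks_created, errors_seen
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + rng.normal(size=500) > 0).astype(int)

X_std = StandardScaler().fit_transform(X)  # standardize so coefficients compare
model = LogisticRegression().fit(X_std, y)

for name, coef in zip(["sessions", "tasks_created", "errors_seen"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")  # sign and size hint at each driver's effect
```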

Posted 1 month ago

Apply

10 - 15 years

35 - 40 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are looking for an experienced Technical Lead or Technical Architect with expertise in Data Warehousing, DBT, Snowflake, Data Modeling, and Data Engineering. The ideal candidate will have strong hands-on experience in customer handling, requirements gathering, and analysis for new requirements and Change Requests (CRs). You will lead and manage key technical initiatives, ensuring smooth delivery and execution of data engineering and warehousing solutions.

Key Responsibilities:
- Lead and manage data warehousing projects, ensuring timely and efficient execution.
- Work extensively with Amazon Aurora, Amazon RDS, AWS DMS, Amazon DynamoDB, Oracle, and PL/SQL.
- Design and implement data models, data architecture, and data pipelines.
- Collaborate with clients to gather requirements, analyze needs, and provide solutions for new features and Change Requests (CRs).
- Optimize database queries and maintain data integrity across multiple platforms.
- Provide technical guidance on data engineering best practices, ensuring the implementation of scalable, efficient, and secure solutions.
- Work closely with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure proper documentation and reporting of data models, architecture, and configurations.
- Stay up to date with industry trends and innovations in data warehousing and engineering.

Required Skills:
- Experience with Data Warehousing: Hands-on experience with DBT, Snowflake, Data Modeling, and Data Engineering.
- Strong Technical Skills: Proficiency in Amazon Aurora, Amazon RDS, AWS DMS, Amazon DynamoDB, Oracle, PL/SQL, and SQL.
- Data Architecture: Ability to design and implement scalable and efficient data architectures.
- Customer Handling: Experience in managing client relationships, gathering requirements, and analyzing business needs.
- Database Management: In-depth knowledge of relational and NoSQL databases and experience optimizing data pipelines.
- Problem-Solving Skills: Strong analytical and troubleshooting abilities to ensure data integrity and quality.
- Communication Skills: Excellent communication skills to interact with clients, stakeholders, and internal teams.
- Leadership Experience: Ability to lead technical teams and ensure the successful delivery of data-driven projects.

Preferred Qualifications:
- Certifications in AWS or related data engineering fields.
- Experience working in Agile methodologies or similar frameworks.
- Previous experience handling large-scale data engineering projects.
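One small operational slice of this role, monitoring an AWS DMS replication task, can be sketched with boto3 as below. The region and task identifier are placeholders, and credentials are assumed to come from the environment.

```python
# Hedged sketch: check the status and full-load progress of a DMS task.
# Region and task ID are hypothetical placeholders.
import boto3

dms = boto3.client("dms", region_name="ap-south-1")  # placeholder region

resp = dms.describe_replication_tasks(
    Filters=[{
        "Name": "replication-task-id",
        "Values": ["oracle-to-aurora-task"],  # placeholder task identifier
    }]
)

for task in resp["ReplicationTasks"]:
    stats = task.get("ReplicationTaskStats", {})
    print(
        task["ReplicationTaskIdentifier"],
        task["Status"],
        f"{stats.get('FullLoadProgressPercent', 0)}% full load complete",
    )
```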

Posted 1 month ago

Apply

1 - 4 years

3 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job_Description":" This is a remote position. Overview : 7 \u2013 10 years of experience Strong experience in Informatica PowerCenter ETL Design and developing ETL Mappings, Mapplets, Workflows, Worklets Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, CDC, aggregations, star schema, etc) Develops stored procedures, database triggers and SQL queries Knowledge of appropriate data partitioning strategy in Data Warehouse Knowledge of Informatica Data Quality tool Working experience of CI/CD or DevOps tools like \u2013 Gitlab / Jenkins/Bit Bucket Benefits Diversity Inclusion: At Exavalu, we are committed to building a diverse and inclusive workforce. We welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity or any other status protected by applicable law. We nurture a culture that embraces all individuals and promotes diverse perspectives, where you can make an impact and grow your career. Exavalu also promotes flexibility depending on the needs of employees, customers and the business. It might be part-time work, working outside normal 9-5 business hours or working remotely.. . ","Job_Type":"Full time","

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are seeking a Data Analytics & Business Intelligence Lead with deep expertise in analytics, data warehousing, and cross-functional reporting. This role is critical to shaping and driving our data strategy across all major functions, including Finance, Supply Chain, and Revenue/Growth, within our e-commerce ecosystem. The ideal candidate will own the end-to-end analytics and reporting lifecycle, delivering actionable insights that directly influence strategic decisions and operational outcomes.

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure timely project delivery.
- Provide guidance and support to team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes.
- Knowledge of cloud data platforms.
- Hands-on experience in SQL development.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

Apply

2 - 5 years

11 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

A person at this position has gained significant work experience and is able to apply their knowledge effectively and deliver results. They are able to analyse and interpret complex problems and to improve, change, or adapt existing methods to solve them. They regularly interact with interfacing groups and customers to clarify technical issues and resolve them, participate actively in important project and work-related activities, and contribute to identifying important issues and risks. They reach out for guidance and advice to ensure high quality of deliverables, and consistently seek opportunities to enhance existing skills, acquire more complex skills, and improve their proficiency in their field of specialisation. Works under limited supervision of a Team Lead/Project Manager.

Roles & Responsibilities
- Responsible for design, coding, testing, bug fixing, documentation, and technical support in the assigned area.
- Responsible for on-time delivery while adhering to quality and productivity goals.
- Responsible for adhering to guidelines and checklists for all deliverable reviews, sending status reports to the team lead, and following relevant organizational processes.
- Responsible for customer collaboration and interactions, and support for customer queries.
- Expected to enhance technical capabilities through trainings, self-study, and periodic technical assessments.
- Expected to participate in technical initiatives related to the project and organization, and deliver training as per plan and quality.

Education and Experience Required
- Engineering graduate, MCA, etc.
- Experience: 2-5 years

Competencies Description
The data engineering TCB is applicable to one who:
1) Creates databases and storage for relational and non-relational data sources.
2) Develops data pipelines (ETL/ELT) to clean, transform, and merge data sources into a usable format.
3) Creates a reporting layer with pre-packaged scheduled reports, dashboards, and charts for self-service BI.
4) Has experience with cloud platforms such as AWS, Azure, and GCP in implementing data workflows.
5) Has experience with tools like MongoDB, Hive, HBase, Spark, Tableau, Power BI, Python, Scala, SQL, Elasticsearch, etc.

Platforms: AWS, Azure, GCP
Technology Standard: NA
Tools: MongoDB, Hive, HBase, Tableau, Power BI, Elasticsearch, QlikView
Languages: Python, R, Spark, Scala, SQL
Specialization: DWH, Big Data Engineering, Edge Analytics
Must to have Skills
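The competency described above (pipelines that clean, transform, and merge sources into a usable format for a reporting layer) can be illustrated with a brief PySpark sketch; all paths and column names are invented placeholders.

```python
# ETL sketch matching the competency note: read raw CSVs, clean and enrich,
# write a curated layer for BI. All paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

sales = spark.read.csv("/data/raw/sales.csv", header=True, inferSchema=True)
regions = spark.read.csv("/data/ref/regions.csv", header=True, inferSchema=True)

clean = (
    sales.dropna(subset=["order_id", "amount"])    # drop unusable rows
    .withColumn("amount", F.col("amount").cast("double"))
    .join(regions, on="region_id", how="left")     # enrich with reference data
)

# Parquet output feeds downstream dashboards / self-service BI.
clean.write.mode("overwrite").partitionBy("region_name").parquet("/data/curated/sales/")
```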

Posted 1 month ago

Apply