4.0 - 6.0 years
7 - 14 Lacs
Udaipur, Kolkata, Jaipur
Hybrid
Senior Data Engineer

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Data Engineer
Experience: 4-6 Yrs
Location: Udaipur, Jaipur, Kolkata

Job Description: We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
- Collaborate with data analysts, data scientists, and product teams to understand data needs
- Optimize queries and data models for performance and reliability
- Integrate data from various sources, including APIs, internal databases, and third-party systems
- Monitor and troubleshoot data pipelines to ensure data quality and integrity
- Document processes, data flows, and system architecture
- Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
- 4-6 years of experience in data engineering, data architecture, or backend development with a focus on data
- Strong command of SQL for data transformation and performance tuning
- Experience with Python (e.g., pandas, Spark, ADF)
- Solid understanding of ETL/ELT processes and data pipeline orchestration
- Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
- Basic programming skills
- Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
- Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
- Exposure to enterprise solutions (e.g., Databricks, Synapse)
- Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
- Background in real-time data streaming and event-driven architectures
- Understanding of data governance, security, and compliance best practices
- Prior experience working in an agile development environment

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
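The core responsibility above, building ETL/ELT pipelines in SQL and Python, can be sketched in miniature. The snippet below is an illustrative extract-transform-load loop using an in-memory SQLite database; the table names (`raw_orders`, `orders`) and the cents-to-currency transformation are invented for the example, not taken from the posting.

```python
import sqlite3

# Illustrative ETL sketch (hypothetical schema, not from the posting):
# extract raw rows, transform them in Python, load into a curated table.
def run_etl(conn: sqlite3.Connection) -> int:
    # Extract: read raw order rows from the source table
    rows = conn.execute("SELECT id, amount_cents FROM raw_orders").fetchall()
    # Transform: convert cents to currency units, drop invalid negatives
    cleaned = [(rid, cents / 100.0) for rid, cents in rows if cents >= 0]
    # Load: write into the target table idempotently (re-runs are safe)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1250), (2, -300), (3, 990)])
loaded = run_etl(conn)  # the negative-amount row is filtered out
```

A production pipeline would swap the in-memory database for real sources and sinks and add the monitoring and error handling the posting describes, but the extract-transform-load shape stays the same.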
Posted 1 month ago
4.0 - 8.0 years
10 - 20 Lacs
Kolkata, Gurugram, Bengaluru
Work from Office
Job Opportunity for GCP Data Engineer

Role: Data Engineer
Location: Gurugram / Bengaluru / Kolkata (5 days work from office)
Experience: 4+ Years

Key Skills:
- Data Analysis / Data Preparation - Expert
- Dataset Creation / Data Visualization - Expert
- Data Quality Management - Advanced
- Data Engineering - Advanced
- Programming / Scripting - Intermediate
- Data Storytelling - Intermediate
- Business Analysis / Requirements Analysis - Intermediate
- Data Dashboards - Foundation
- Business Intelligence Reporting - Foundation
- Database Systems - Foundation
- Agile Methodologies / Decision Support - Foundation

Technical Skills:
- Cloud - GCP - Expert
- Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert
- Data warehousing solutions - Advanced
- ETL tools - Advanced
- Data APIs - Advanced
- Python, Java, Scala, etc. - Intermediate
- Basic understanding of distributed systems - Foundation
- Basic knowledge of algorithms and optimal data structures for analytics - Foundation
- Soft skills and time management skills - Foundation
Posted 1 month ago
10.0 - 12.0 years
35 - 40 Lacs
Kolkata, New Delhi, Bengaluru
Work from Office
Seeking a Data Architect with strong ETL, data modeling, data warehouse, SQL, PySpark, and cloud experience. Architect experience is mandatory. Only immediate joiners or candidates currently serving their notice period will be considered.
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Kolkata
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact.

We are seeking skilled SQL Analysts with a minimum of 1 year of experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of SQL development experience
- Expertise in writing complex SQL queries and optimizing database performance
- Experience designing database schemas
- Proficiency with relational database management systems such as MySQL, PostgreSQL, or SQL Server

Why Join Us:
- Competitive pay (Rs 1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and is typically up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!
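As an illustration of the "complex SQL queries" this role calls for, here is a small window-function query run through Python's sqlite3 module. The `orders` schema and data are made up for the sketch.

```python
import sqlite3

# Hypothetical example: find each customer's largest order with a
# RANK() window function, a common "complex SQL" building block.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 20.0)])

top_orders = conn.execute("""
    SELECT customer, amount
    FROM (
        SELECT customer, amount,
               RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
        FROM orders
    )
    WHERE rnk = 1
    ORDER BY customer
""").fetchall()
# -> [('a', 30.0), ('b', 20.0)]
```

The same pattern (rank within a partition, then filter on the rank) carries over directly to MySQL 8+, PostgreSQL, and SQL Server.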
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Kolkata
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact.

We are seeking skilled Data Analysts with a minimum of 1 year of experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of data analysis experience
- Strong knowledge of Python, R, or SQL
- Proficiency in data visualization tools (e.g., Tableau, Power BI)
- Statistical analysis expertise

Why Join Us:
- Competitive pay (Rs 1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and is typically up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!
Posted 1 month ago
3.0 - 8.0 years
7 - 17 Lacs
Bhubaneswar, Kolkata, Chennai
Work from Office
Job description: Hiring for ETL/DWT/Big Data Testing, with an experience range of 3-10 years and above. Mandatory Skills: ETL/DWT/Big Data Testing. Education: BE/B.Tech/MCA/M.Tech/MSc/MS
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Kolkata
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members and stakeholders.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the application development process
- Ensure effective communication among team members and stakeholders
- Identify and address any issues or bottlenecks in the development process

Professional & Technical Skills:
- Must have: Proficiency in Informatica MDM
- Strong understanding of data integration and master data management concepts
- Experience in designing and implementing Informatica MDM solutions
- Knowledge of data quality and governance best practices
- Hands-on experience with Informatica MDM tools and technologies

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM
- This position is based at our Indore office
- 15 years of full-time education is required

Qualifications: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Bhubaneswar, Kolkata, Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: AWS Administration
Minimum 3 year(s) of experience is required
Educational Qualification: BE

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Develop and implement efficient and scalable application solutions
- Collaborate with cross-functional teams to analyze and address technical issues
- Stay updated on industry trends and best practices
- Conduct code reviews and provide constructive feedback
- Assist in troubleshooting and resolving application issues

Professional & Technical Skills:
- Must have: Proficiency in Ab Initio
- Good to have: Experience with AWS Administration
- Strong understanding of ETL processes and data integration
- Knowledge of data warehousing concepts and methodologies
- Experience in developing and optimizing data pipelines
- Familiarity with database technologies and SQL queries

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio
- This position is based at our Bhubaneswar office
- A BE degree is required

Qualifications: BE
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Kolkata, Hyderabad, Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Collaborate with cross-functional teams to gather and analyze requirements
- Design, develop, and test software applications using Ab Initio
- Troubleshoot and debug issues in existing applications
- Ensure the scalability and performance of applications
- Document technical specifications and user manuals
- Stay up-to-date with industry trends and advancements in Ab Initio technology

Professional & Technical Skills:
- Must have: Proficiency in Ab Initio
- Good to have: Experience with data integration tools
- Strong understanding of ETL concepts and data warehousing principles
- Experience in designing and implementing complex data integration solutions
- Knowledge of SQL and database concepts
- Familiarity with Unix/Linux environments

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualifications: 15 years full time education
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Kolkata
Work from Office
The Talend Open Studio role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Talend Open Studio domain.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Kolkata
Work from Office
The Informatica Product 360 role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Informatica Product 360 domain.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Kolkata
Work from Office
The Google Data Engineering role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Google Data Engineering domain.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Kolkata
Work from Office
The STIBO Master Data Management role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the STIBO Master Data Management domain.
Posted 1 month ago
5.0 - 10.0 years
13 - 17 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We are looking for a Data Engineering Lead with deep expertise in AWS, Python, and modern data engineering practices to architect and deliver scalable, secure, and efficient data solutions. In this role, you'll lead the design and development of ETL pipelines, implement robust data warehousing solutions, and support machine learning workflows across various business domains. Your strong understanding of ETL and data warehousing concepts, including Slowly Changing Dimensions (SCD) and the Medallion Architecture, along with proficiency in SQL and AWS-native tools, will be key to success.

Key Responsibilities

Data Pipeline & Architecture:
- Design, develop, and maintain scalable ETL pipelines leveraging AWS services such as Glue, S3, Lambda, and Athena. Hands-on Python and PySpark coding experience is expected.
- Implement the Medallion Architecture (bronze, silver, gold layers) to organize and optimize data for both operational and analytical use cases.
- Apply best practices in data modeling and data warehousing, including handling SCD Type 1 and Type 2 transformations.

Data Management & Quality:
- Ensure data accuracy, completeness, and reliability across batch and near-real-time workflows.
- Create robust error handling, logging, and monitoring processes for pipeline operations.
- Write and optimize complex SQL queries for data transformations, validations, and reporting use cases.

Collaboration & Delivery:
- Partner with data engineers, analysts, and product teams to enable data access and support analytical models and dashboards.
- Participate in code reviews, contribute to best practices, and ensure high engineering standards.
- Troubleshoot and resolve data pipeline and infrastructure issues in production environments.

Must-Have Qualifications:
- 5+ years of hands-on experience in data engineering, particularly with Python, PySpark, and SQL.
- Proven experience building ETL pipelines using AWS services (Glue, Lambda, S3, Athena, Step Functions).
- Strong knowledge of data warehousing concepts, SCD Types (Type 1, Type 2), and the Medallion Architecture.
- Solid understanding of data modeling, partitioning, and performance optimization in big data environments.
- Excellent communication skills with the ability to explain technical topics to non-technical stakeholders.

Nice-to-Have Skills:
- Experience with dbt for data transformation and testing.
- Hands-on exposure to Databricks and the Lakehouse architecture.
- Familiarity with CI/CD for data pipelines and infrastructure-as-code using tools like Terraform or CloudFormation.
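The SCD Type 2 handling named in the qualifications can be illustrated with a minimal sketch: instead of overwriting a changed attribute in place (Type 1), the current dimension row is closed out and a new versioned row is inserted, preserving history. SQLite stands in for a real warehouse here, and the `dim_customer` schema is invented for the example.

```python
import sqlite3

# Minimal SCD Type 2 sketch over a hypothetical customer dimension.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")

def apply_scd2(conn, customer_id, city, load_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur is not None and cur[0] == city:
        return  # attribute unchanged: nothing to do
    if cur is not None:
        # Type 2: close the existing current row instead of updating in place
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (load_date, customer_id))
    # Insert the new current version with an open-ended validity window
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date))
    conn.commit()

apply_scd2(conn, 1, "Kolkata", "2024-01-01")
apply_scd2(conn, 1, "Mumbai", "2024-06-01")   # city change -> new version
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
# both versions survive: the old Kolkata row is closed, Mumbai is current
```

In a warehouse like Redshift or Snowflake the same logic is usually expressed as a single MERGE statement, but the close-out-and-insert pattern is identical.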
Posted 1 month ago
5.0 - 10.0 years
10 - 14 Lacs
Kolkata
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Stibo Product Master Data Management
Good to have skills: Informatica MDM
Minimum 5 year(s) of experience is required
Educational Qualification: Graduate

Summary: As an Application Lead for Packaged Application Development, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Stibo Product Master Data Management and Informatica MDM to ensure successful project delivery.

Roles & Responsibilities:
- Lead the design, development, and implementation of Stibo Product Master Data Management solutions.
- Collaborate with cross-functional teams to ensure successful project delivery, acting as the primary point of contact.
- Provide technical leadership and guidance to team members, ensuring adherence to best practices and standards.
- Develop and maintain project plans, timelines, and budgets.
- Identify and mitigate project risks.

Professional & Technical Skills:
- Must have: Experience with Stibo Product Master Data Management.
- Must have: Strong understanding of data modeling and data management concepts.
- Good to have: Experience with Informatica MDM.
- Strong understanding of software development life cycle (SDLC) methodologies.
- Experience with Agile development methodologies.
- Experience with project management tools such as JIRA or Microsoft Project.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Stibo Product Master Data Management.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering successful projects.
- This position is based at our Kolkata office.

Qualification: Graduate
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
As a Strategic Projects Lead (SPL), you will lead initiatives that drive new revenue for the business. This is a demanding role, and as an SPL, you should be prepared to wear many hats across all dimensions of Operations. The ideal SPL has a strong entrepreneurial mindset, is comfortable getting into the weeds, and is excited about intense, impactful work that leads to accelerated career progression.

You will:
- Serve as the full owner of our most visible and high-impact customer pipelines, making decisions that directly impact data quality, operational efficiency, revenue, and margins
- Understand customer requirements and design the data taxonomy best suited to improving model performance based on customer needs
- Build out pipeline infrastructure to ensure quality and efficiency
- Train, coach, and manage dynamic and global teams
- Build analytics to make data-driven decisions
- Partner with diverse stakeholders (Engineering, Product, Ops, Go-to-Market) to work on problems that will drive advancements for the largest LLMs in the world
- Give regular progress updates to Scale's executive team

Ideally, you'd have:
- A strong technical background (a degree in engineering or computer science is ideal; at minimum, the role requires the ability to do data analytics using SQL or Python)
- 2+ years of experience leading a team, developing product or operational processes, or working as a SWE
- Strong problem-solving capabilities (experience working on operational challenges is a plus) and the ability to come up with creative solutions to complex, ambiguous operational problems
- Entrepreneurial experience and mindset: you are excited about building things from scratch and are able to identify issues and execute quickly
Posted 1 month ago
5.0 - 10.0 years
25 - 30 Lacs
Guwahati, Kolkata, Odisha
Work from Office
Role: Service Operations Lead (AVP) - WFH and WFO
Department: Service & Operations
Employment Type: Full Time
Role Category: Operations

Job description: Technology Operations is responsible for the technical infrastructure required to supply IT services to the bank. Work includes:
- Overseeing strategy, design, development, and deployment of IT solutions
- Improving or developing new products, components, equipment, systems, technologies, or processes
- Ensuring that research and design methodologies meet established scientific and engineering standards
- Assisting with formulating business plans and budgets for product development
- Analysing quality/safety test results to ensure compliance with internal and external standards
- Keeping abreast of new developments in the industry and translating those developments into new and viable options for the organization and clients
- Organising technical presentations to clients and/or industry groups
- Monitoring product development outcomes to ensure technical, functional, cost, and timing targets are met
- May be responsible for managing the product regulatory approval process

Your key responsibilities:
- Management of support activities for service operations, such as troubleshooting and solving incidents and problems in service delivery
- Take overall responsibility for the resolution of incidents and problems within the team and oversee the resolution of complex incidents
- Ensure analysts are using the correct troubleshooting methods and processes
- Assist in managing relationships with involved business partners
- Manage escalations by collaborating with Customer Service, other service operations specialists, and relevant TDI functions to properly and quickly resolve escalated issues
- Maintain an overview of areas that need monitoring, reporting, and improvement
- Identify the required metrics and ensure that they are addressed, monitored, and improved where necessary
- Experience with multiple technology stacks (Unix, PL/SQL, Java, etc.)
- Working knowledge of standard tools (ServiceNow, Confluence, JIRA)
- ITIL & GCP qualifications desirable

Your skills and experience:
- Prior experience in defining, developing, and implementing Automic (UC4) workload processing according to requirements, batch processing, and related tasks
- Ability to reverse engineer processes and systems when necessary
- Familiar with UNIX, Shell & Perl scripting, and command line environments
- Familiar with documentation in tools like JIRA, Confluence, and ServiceNow
- Prior work involving use of ITIL standards and procedures, naming conventions, client concept, alarming concept, scripting, centralized includes
- Proficiency in an ETL tool (Informatica) and SAP (good to have)
- Proficiency in SQL for data querying and manipulation (good to have)

Expected tasks to perform:
- Implement new tasks and workflows as per customer requirements (scheduling, monitoring, and alarming)
- Automate and optimize existing processes to improve efficiency and reliability
- Proven prior ability to plan and analyze workflows and processes to troubleshoot issues and improve performance
- Handle deployments and releases of new features and updates
- Analyze and classify errors to ensure timely resolution
- Manage incidents and contribute to root cause analysis
- Document processes, configurations, and solutions for future reference and knowledge sharing

Location: Odisha, Guwahati, Kolkata, Pune, Chennai, Hyderabad, Mumbai, Thane, Navi Mumbai, Coimbatore, Vijayawada, Vishak
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis. Key Responsibilities: Architect distributed data systems. Enable data discoverability and quality. Develop data tooling and platform APIs. Required Skills & Qualifications: Experience with Spark, Kafka, and Delta Lake. Proficiency in Python, Scala, or Java. Familiar with cloud-based data platforms. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies
Posted 1 month ago
1.0 - 6.0 years
3 - 8 Lacs
Kolkata
Work from Office
Job Responsibilities:
- Responsible for end-to-end report creation and delivery.
- Support client servicing by developing impactful client presentations and actionable insights.
- Conduct in-depth data analysis and design interactive dashboards to track key metrics and performance indicators.
- Collaborate with cross-functional teams to gather requirements and align deliverables with business goals.

Essential Skills:
- Technical Skills: Strong understanding of MS Office, SQL, Power BI
- Analytical Skills: Analyzing data, identifying trends, and drawing conclusions
- Communication Skills: Ability to communicate technical information clearly and concisely to both technical and non-technical audiences
- Problem-Solving Skills: Ability to identify and resolve technical issues related to MIS systems and data management
- Organizational Skills: Ability to manage multiple tasks and projects simultaneously
- Attention to Detail: Accuracy and meticulousness in data entry and reporting
- Knowledge of MIS Principles: Understanding of MIS concepts, data warehousing, and business intelligence

Experience: Minimum 1 year, up to 4 years

Note: This role involves working in a high-pressure environment.
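Behind a dashboard metric like those described above usually sits a simple aggregation step. A toy Python version of that step, with made-up field names, assuming raw region/month sales records:

```python
from collections import defaultdict

# Hypothetical raw records feeding an MIS report; field names are invented.
records = [
    {"region": "East", "month": "2024-01", "sales": 120},
    {"region": "East", "month": "2024-01", "sales": 80},
    {"region": "West", "month": "2024-01", "sales": 150},
]

# KPI: total sales per (region, month), the figure a dashboard tile shows.
totals = defaultdict(int)
for r in records:
    totals[(r["region"], r["month"])] += r["sales"]
```

In practice the same aggregation would be a SQL GROUP BY or a Power BI measure, but the logic is this grouping step.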
Posted 1 month ago
15.0 - 20.0 years
16 - 17 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
JD#1: Subject Matter Expert with a minimum of 15 years' overall experience in Data Engineering, with hands-on and architecture experience in BigQuery.
- Complete expert on BigQuery
- Should have designed and architected solutions on BigQuery, data warehousing, and data lakes
- Should be an expert on data pipelines with BigQuery
- Should have managed 50+ TB of data on BigQuery clusters, with dashboarding tools to provide insights
- Performance tuning of BigQuery in production
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, Seeking a Cloud Monitoring Specialist to set up observability and real-time monitoring in cloud environments. Key Responsibilities: Configure logging and metrics collection. Set up alerts and dashboards using Grafana, Prometheus, etc. Optimize system visibility for performance and security. Required Skills & Qualifications: Familiar with ELK stack, Datadog, New Relic, or Cloud-native monitoring tools. Strong troubleshooting and root cause analysis skills. Knowledge of distributed systems. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 1 month ago
8.0 - 10.0 years
11 - 18 Lacs
Kolkata
Work from Office
Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. 
- Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs. Key Responsibilities: Design cloud-native solutions using AWS, Azure, or GCP Lead cloud migration and transformation projects Define cloud governance, cost control, and security strategies Collaborate with DevOps and engineering teams for implementation Required Skills & Qualifications: Deep expertise in cloud architecture and multi-cloud environments Experience with containers, serverless, and microservices Proficiency in Terraform, CloudFormation, or equivalent Bonus: Cloud certification (AWS/Azure/GCP Architect) Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 1 month ago
6.0 - 11.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
PureSoftware is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey.
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, Looking for a Cloud Data Engineer to build cloud-based data pipelines and analytics platforms. Key Responsibilities: Develop ETL workflows using cloud data services. Manage data storage, lakes, and warehouses. Ensure data quality and pipeline reliability. Required Skills & Qualifications: Experience with BigQuery, Redshift, or Azure Synapse. Proficiency in SQL, Python, or Spark. Familiarity with data lake architecture and batch/streaming. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 1 month ago