5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You have extensive experience in analytics and large-scale data processing across diverse data platforms and tools. Your responsibilities will include managing data storage and transformation across AWS S3, DynamoDB, Postgres, and Delta Tables with efficient schema design and partitioning. You will develop scalable analytics solutions using Athena and automate workflows with proper monitoring and error handling. Ensuring data quality, access control, and compliance through robust validation, logging, and governance practices will be a crucial part of your role. Additionally, you will design and maintain data pipelines using Python, Spark, the Delta Lake framework, AWS Step Functions, EventBridge, AppFlow, and OAuth. The tech stack you will be working with includes S3, Postgres, DynamoDB, Tableau, Python, and Spark.
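For illustration only, a minimal PySpark sketch of the kind of partitioned Delta Lake write this role describes; the bucket, paths, and column names are hypothetical, and it assumes the delta-spark package is available on the cluster:

```python
# Minimal sketch: writing a partitioned Delta table to S3 with PySpark.
# Bucket, paths, and schema are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("delta-partition-sketch")
    # Assumes the delta-spark package/jars are available on the cluster.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

raw = spark.read.json("s3://example-bucket/raw/orders/")  # hypothetical source
cleaned = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Partitioning by date lets Athena/Spark prune files instead of scanning everything.
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("s3://example-bucket/delta/orders/"))
```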
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are an experienced Snowflake Architect with over 12 years of experience in data warehousing, cloud architecture, and Snowflake implementations. Your expertise lies in designing, optimizing, and managing large-scale Snowflake data platforms to ensure scalability, performance, and security. You are expected to possess deep technical knowledge of Snowflake, cloud ecosystems, and data engineering best practices. Your key responsibilities will include leading the design and implementation of Snowflake data warehouses, data lakes, and data marts. You will define best practices for Snowflake schema design, clustering, partitioning, and optimization. Additionally, you will architect multi-cloud Snowflake deployments with seamless integration and design data sharing, replication, and failover strategies for high availability. You will be responsible for optimizing query performance using Snowflake features, implementing automated scaling strategies for dynamic workloads, and troubleshooting performance bottlenecks in large-scale Snowflake environments. Furthermore, you will architect ETL/ELT pipelines using Snowflake, Coalesce, and other tools, integrate Snowflake with BI tools, ML platforms, and APIs, and implement CDC, streaming, and batch processing solutions. In terms of security, governance, and compliance, you will define RBAC, data masking, row-level security, and encryption policies in Snowflake. You will ensure compliance with GDPR, CCPA, HIPAA, and SOC2 regulations and establish data lineage, cataloging, and auditing using Snowflake's governance features. As a leader, you will mentor data engineers, analysts, and developers on Snowflake best practices, collaborate with C-level executives to align Snowflake strategy with business goals, and evaluate emerging trends for innovation. Your required skills and qualifications include over 12 years of experience in data warehousing, cloud architecture, and database technologies, 8+ years of hands-on Snowflake architecture and administration experience, and expertise in SQL and Python for data processing. Deep knowledge of Snowflake features, experience with cloud platforms, and strong understanding of data modeling are also essential. Certification as a Snowflake Advanced Architect is a must. Preferred skills include knowledge of DataOps, MLOps, and CI/CD pipelines, as well as familiarity with DBT, Airflow, SSIS, and IICS.
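As a hedged illustration of the clustering and data-masking duties above (not this employer's actual setup), a short sketch using the Snowflake Python connector; the account, warehouse, database, table, and role names are all placeholders:

```python
# Illustrative sketch: applying a clustering key and a column masking policy
# through the Snowflake Python connector. All identifiers are made up.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",          # hypothetical account identifier
    user="ARCHITECT_SVC",
    password="***",
    warehouse="ADMIN_WH",
    database="ANALYTICS",
    schema="SALES",
)

cur = conn.cursor()
# Clustering key to keep micro-partitions organised for date-range scans.
cur.execute("ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE)")

# Column-level masking so only the ANALYST role sees raw email addresses.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY EMAIL_MASK AS (VAL STRING)
    RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('ANALYST') THEN VAL ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK")

cur.close()
conn.close()
```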
Posted 1 week ago
5.0 - 10.0 years
0 - 0 Lacs
Hyderabad, Bengaluru
Hybrid
Role & responsibilities: AWS Lambda, AWS EC2, AWS S3, RESTful APIs, Java
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
You should have strong proficiency in MongoDB, Express, Angular 2+, Node.js, JavaScript, REST APIs, CSS, and HTML5. A minimum qualification of any graduation with excellent communication is required, along with 3 to 6 years of experience. Your responsibilities will require familiarity with RESTful APIs, AngularJS, MongoDB, Node.js, and Express.js. You should have hands-on experience with Node modules such as Socket, Passport, and Nodemailer. Additionally, you must possess hands-on experience in designing and defining the architecture of complex web-based applications and microservices. Proficiency in using Git or any other version control system is essential. As part of the role, you will be expected to understand requirements and be responsible for the analysis, coding, design, and development of web-based applications. You should be able to implement a robust set of services and APIs to power web applications. In-depth knowledge of and work experience in Node.js are a must, along with previous experience in building full-stack web applications using Node.js/Express and integrating APIs from platforms such as Google, Facebook, and Twilio. Experience in JavaScript, jQuery, CSS3, and HTML5, as well as AngularJS/Angular 2+ and AWS S3, is required. You should also have knowledge of the Angular 2+ framework and good experience in Bootstrap or other CSS frameworks. Proficiency in Node.js, creating database schemas that represent and support business processes, implementing automated testing platforms and unit tests, and understanding code versioning tools like Git are all necessary for this role.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a skilled Backend Developer, you will be responsible for developing and maintaining scalable server-side applications using Node.js, Express.js, and TypeScript. Your role will involve designing robust and secure RESTful APIs with proper routing, middleware, and error handling. You will build and optimize relational database schemas using PostgreSQL, ensuring performance, normalization, and data integrity. In this position, you will integrate and manage ORMs like Prisma or TypeORM for efficient and type-safe database operations. Additionally, you will implement authentication and authorization using JWT, session-based methods, and OAuth protocols. It will be your responsibility to validate request and response data using Zod or Joi to ensure type safety and data integrity. Furthermore, you will handle file uploads and media storage using Multer and integrate with services like Cloudinary, AWS S3, or similar platforms. Writing clean, testable, and modular code following SOLID principles will be crucial. You will also be expected to create and maintain API documentation using tools such as Postman or Swagger. As part of your role, you will implement security best practices including input sanitization, rate limiting, secure headers, and CORS configuration. Unit and integration testing using Jest and Supertest will be necessary. Collaboration with frontend developers to deliver seamless API experiences and managing deployments using platforms like Vercel, Render, Railway, DigitalOcean, or AWS (EC2/S3) are also key responsibilities. Your duties will include configuring CI/CD pipelines using GitHub Actions, PM2, or Docker for automated builds and deployments. Handling environment configuration securely using .env files and secret managers will be essential. Working with version control (Git) to manage the codebase, branches, and code reviews is also part of the role. Monitoring and debugging production issues to ensure application reliability and performance will be critical. Additionally, building real-time features using WebSockets or Socket.IO is optional but considered a valuable skill. Your expertise in Node.js, Express.js, TypeScript, PostgreSQL, ORMs like Prisma or TypeORM, authentication, authorization, data validation, file uploads, API documentation, Git, testing, security best practices, CI/CD, deployment experience, environment variable management, and cloud platforms will be instrumental in delivering high-quality backend solutions. Join our photography company based in Noida, operating across India and internationally, specializing in wedding and pre-wedding shoots, maternity and newborn photography, as well as corporate and event coverage. Visit www.theimpressio.com and www.theimpressio.in to learn more about our work. We look forward to welcoming you to our dynamic team of professionals.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
Job Description: As the Digital Transformation Lead at Godrej Agrovet Limited (GAVL) in Mumbai, you will play a crucial role in driving innovation and productivity in the agri-business sector. GAVL is dedicated to enhancing the livelihood of Indian farmers by developing sustainable solutions that enhance crop and livestock yields. With leading market positions in Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry, and Processed Foods, GAVL is committed to making a positive impact on the agricultural industry. With an annual sales figure of 6,000 crore INR in FY 18-19, GAVL has a widespread presence across India, offering high-quality feed and nutrition products for cattle, poultry, aqua feed, and specialty feed. The company operates 50 manufacturing facilities, has a network of 10,000 rural distributors/dealers, and employs over 2,500 individuals. At GAVL, our people philosophy revolves around the concept of tough love. We set high expectations for our team members, recognizing and rewarding performance and potential through career growth opportunities. We prioritize the development, mentoring, and training of our employees, understanding that diverse interests and passions contribute to a strong team dynamic. We encourage individuals to explore their full potential and provide a supportive environment for personal and professional growth. In this role, you will utilize your expertise as a Data Scientist to extract insights from complex datasets, develop predictive models, and drive data-driven decisions across the organization. You will collaborate with various teams, including business, engineering, and product, to apply advanced statistical methods, machine learning techniques, and domain knowledge to address real-world challenges.
Key Responsibilities:
- Data Cleaning, Preprocessing & Exploration: Prepare and analyze data, ensuring quality and completeness by addressing missing values, outliers, and data transformations to identify patterns and anomalies.
- Machine Learning Model Development: Build, train, and deploy machine learning models using tools like MLflow on the Databricks platform, exploring regression, classification, clustering, and time series analysis techniques.
- Model Evaluation & Deployment: Enhance model performance through feature selection, leveraging distributed computing capabilities for efficient processing, and utilizing CI/CD tools for deployment automation.
- Collaboration: Work closely with data engineers, analysts, and stakeholders to understand business requirements and translate them into data-driven solutions.
- Data Visualization and Reporting: Create visualizations and dashboards to communicate insights to technical and non-technical audiences using tools like Databricks and Power BI.
- Continuous Learning: Stay updated on the latest advancements in data science, machine learning, and industry best practices to enhance skills and processes.
Required Technical Skills:
- Proficiency in statistical analysis, hypothesis testing, and machine learning techniques.
- Familiarity with NLP, time series analysis, computer vision, and A/B testing.
- Strong knowledge of Databricks, Spark DataFrames, MLlib, and programming languages and libraries (Python, TensorFlow, Pandas, scikit-learn, PySpark, NumPy).
- Proficiency in SQL for data extraction, manipulation, and analysis, along with experience in MLflow and cloud data storage tools.
Qualifications:
- Education: Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field.
- Experience: Minimum of 3 years in a data science or analytical role.
Join us at Vikhroli, Mumbai, and be a part of our mission to drive digital transformation and innovation in the agricultural sector at Godrej Agrovet Limited.
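To make the MLflow-on-Databricks responsibility concrete, here is a minimal, illustrative tracking sketch with scikit-learn; the dataset, run name, and metric are examples only, and on Databricks the tracking URI would normally be preconfigured by the workspace:

```python
# Illustrative sketch only: tracking a scikit-learn model with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestRegressor(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))

    # Parameters, metrics, and the fitted model are recorded against the run.
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("mae", mae)
    mlflow.sklearn.log_model(model, "model")
```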
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Kochi, Kerala
On-site
As a Lead Engineer, you will be responsible for designing, analyzing, developing, and deploying new features for the product. Your role will involve managing tasks in a sprint, reviewing the code of team members, and ensuring the first-time quality of code. You will actively participate in Agile ceremonies such as sprint planning, story grooming, daily scrums, retrospective meetings, and sprint reviews. Your responsibilities will include connecting with stakeholders to understand requirements and producing technical specifications based on business needs. You will write clean, well-designed code and follow technology best practices. Additionally, you will follow a modern agile-based development process, including automated unit testing. You will take complete ownership of tasks and user stories committed by yourself or the team. It is essential to understand and adhere to the development processes agreed upon at the organization/client level, actively participating in optimizing and evolving these processes for improved project execution. Your role will involve understanding user stories, translating them into technical specifications, and converting them into working code. You will troubleshoot, test, and maintain core product software and databases to ensure strong optimization and functionality. Contribution to all phases of the development lifecycle is expected. You should stay updated on industry trends and tools, pilot them, and ensure that the team can scale up technically to adopt best practices over time. Initiative in suggesting and implementing best practices in respective technology areas is encouraged. Expertise in developing with a Java framework and an RDBMS or NoSQL database back-end is required. Strong skills in Java, REST, Spring Boot, and microservices are essential. Proven expertise in Java 21, Spring Boot MVC, Spring Data, Hibernate, PostgreSQL, and REST APIs is necessary. Knowledge of object-oriented concepts & design patterns is essential. Exposure to the AWS ecosystem and services, along with experience in Docker, AWS S3, AWS Secrets Manager, and CloudWatch, is preferred. Understanding Angular concepts and exposure to web and JavaScript technologies will be advantageous. Experience in writing unit test cases using Jasmine/Karma or Jest is a plus. Demonstrated willingness to learn and develop with new/unfamiliar technologies is expected. Understanding the impacts of performance-based designs, accessibility standards, and security compliance in development is crucial. Good knowledge of project tracking tools like JIRA, Azure DevOps, and project collaboration tools like Confluence is required. Excellent communication skills are essential for conveying ideas with clarity, depth, and detail. Understanding of Continuous Integration and Continuous Delivery best practices, along with experience in setting up CI/CD using Jenkins, GitHub, and plugins, is beneficial.
Posted 2 weeks ago
7.0 - 12.0 years
25 - 30 Lacs
Mumbai
Remote
PLEASE APPLY IF YOU CAN JOIN IMMEDIATELY AND HAVE 7+ YRS AWS DATA ENGINEER EXPERIENCE WITH TERRAFORM AND GIT
Job Description: We are seeking a skilled Data Engineer with 7+ years of experience in data processing, ETL pipelines, and cloud-based data solutions. The ideal candidate will have strong expertise in AWS Glue, Redshift, S3, EMR, and Lambda, with hands-on experience using Python and PySpark for large-scale data transformations. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support analytics and data-driven decision-making. Additionally, the candidate needs strong expertise in Terraform and Git-based CI/CD pipelines to support infrastructure automation and configuration management.
Key Responsibilities:
ETL Development & Automation: Design and implement ETL pipelines using AWS Glue and PySpark to transform raw data into consumable formats. Automate data processing workflows using AWS Lambda and Step Functions.
Data Integration & Storage: Integrate and ingest data from various sources into Amazon S3 and Redshift. Optimize Redshift for query performance and cost efficiency.
Data Processing & Analytics: Use AWS EMR and PySpark for large-scale data processing and complex transformations. Build and manage data lakes on Amazon S3 for analytics use cases.
Monitoring & Optimization: Monitor and troubleshoot data pipelines to ensure high availability and performance. Implement best practices for cost optimization and performance tuning in Redshift, Glue, and EMR.
Terraform & Git-based Workflows: Design and implement Terraform modules to provision cloud infrastructure across AWS/Azure/GCP. Manage and optimize CI/CD pipelines using Git-based workflows (e.g., GitHub Actions, GitLab CI, Jenkins, Azure DevOps). Collaborate with developers and cloud architects to automate infrastructure provisioning and deployments. Write reusable and scalable Terraform modules following best practices and code quality standards. Maintain version control, branching strategies, and code promotion processes in Git.
Collaboration: Work closely with stakeholders to understand requirements and deliver solutions. Document data workflows, designs, and processes for future reference.
Must-Have Skills: Strong proficiency in Python and PySpark for data engineering tasks. Hands-on experience with AWS Glue, Redshift, S3, and EMR. Expertise in building, deploying, and optimizing data pipelines and workflows. Solid understanding of SQL and database optimization techniques. Strong hands-on experience with Terraform, including writing and managing modules, state files, and workspaces. Proficiency in CI/CD pipeline design and maintenance using tools such as GitHub Actions, GitLab CI, Jenkins, or Azure DevOps Pipelines. Deep understanding of Git workflows (e.g., GitFlow, trunk-based development). Experience in serverless architecture using AWS Lambda for automation and orchestration. Knowledge of data modeling, partitioning, and schema design for data lakes and warehouses. Ability to work 8 PM IST to 4 AM IST (night shift, in order to align with the customer's business hours).
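For reference, a hedged skeleton of the kind of AWS Glue PySpark job such a role typically involves; the Glue Data Catalog database, table, and S3 target path are made-up placeholders, and a real job would receive JOB_NAME through its job arguments:

```python
# Illustrative AWS Glue PySpark job skeleton; catalog and path names are hypothetical.
import sys

from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, transform with Spark, write Parquet to S3.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events")  # hypothetical catalog entries
df = dyf.toDF().dropDuplicates(["event_id"])

df.write.mode("overwrite").parquet("s3://example-bucket/curated/events/")
job.commit()
```

In practice the Glue job itself, its IAM role, and the trigger would be declared in Terraform and promoted through the Git-based CI/CD pipeline described above.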
Posted 2 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Kochi, Thiruvananthapuram
Hybrid
Design, analyze, develop, and deploy new features for the product. Lead engineer role in managing tasks in a sprint, reviewing the code of team members, and ensuring first-time quality of code. Actively participate in Agile ceremonies such as sprint planning, story grooming, daily scrums or stand-up meetings, retrospective meetings, and sprint reviews. Connect with stakeholders to understand requirements and produce technical specifications based on the business requirements. Write clean, well-designed code. Follow technology best practices. Follow a modern agile-based development process, including automated unit testing. Take complete ownership of the tasks and user stories committed by self or the team. Understand the development processes agreed at the organization/client level and ensure that these are followed diligently in the project. Actively participate in optimizing and evolving this process for the improvement of project execution. Understand user stories, translate them into technical specifications, and convert them into working code. Troubleshoot, test, and maintain the core product software and databases to ensure strong optimization and functionality. Contribute to all phases of the development lifecycle. Follow industry trends and tools, pilot them, and ensure that the team can scale up technically to absorb technology best practices over time. Readiness to take the initiative in suggesting and implementing best practices in the respective technology areas. Expertise in developing with a Java framework and an RDBMS or NoSQL database back-end. Strong skills in Java, REST, Spring Boot, and microservices. Proven expertise in Java 21, Spring Boot MVC, Spring Data, Hibernate, and PostgreSQL. Good working exposure to REST APIs and very strong knowledge of object-oriented concepts & design patterns. Exposure to the AWS ecosystem and services, and experience with Docker, AWS S3, AWS Secrets Manager, and CloudWatch. Understanding of Angular concepts – Interceptors, Pipes, Directives, Decorators – and exposure to web and JavaScript technologies with HTML/XHTML, XML, JSON, CSS, JavaScript, AJAX, DOM, version control systems (such as Git), and tools like Visual Studio Code will be an advantage. Experience in writing unit test cases using Jasmine/Karma or Jest is a plus. Demonstrated willingness to learn and develop with new/unfamiliar technologies. Understands the impacts of performance-based designs, accessibility standards, and security compliance in development. Good understanding and working knowledge of project tracking tools like JIRA and Azure DevOps, and project collaboration tools like Confluence. Excellent communication skills, conveying ideas with clarity, depth, and detail. Understanding of Continuous Integration and Continuous Delivery best practices, and experience in setting up CI/CD to speed up the software development and deployment process using Jenkins, GitHub, and plugins.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Punjab
On-site
Apptunix is a leading Mobile App & Web Solutions development agency based out of Texas, US. We empower cutting-edge startups & enterprise businesses, paving the path for their incremental growth via technology solutions. Our journey began in mid-2013, and since then we have been dedicated to elevating our clients' interests & satisfaction through rendering improved and innovative software and mobile development solutions. At Apptunix, we strongly comprehend business needs and implement them by merging advanced technologies with our seamless creativity. Currently, we employ 250+ in-house experts who work closely & dedicatedly with clients to build solutions as per their customers' needs. As a candidate for this position, you are required to possess the following skills:
- Deep experience working on Node.js
- Understanding of SQL and NoSQL database systems with their pros and cons
- Experience working with databases like MongoDB
- Solid understanding of MVC and stateless APIs, and building RESTful APIs
- Experience and knowledge of scaling and security considerations
- Integration of user-facing elements developed by front-end developers with server-side logic
- Good experience with Express.js, MongoDB, AWS S3, and ES6
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Implementation of security and data protection
- Integration of data storage solutions and database structure
- Good experience in Next.js, microservices, RabbitMQ, and sockets
We are looking for candidates with 5-8 years of relevant experience. This is a full-time position with a work schedule from Monday to Friday. The work location is in person. If you meet the requirements and are enthusiastic about joining a dynamic team focused on delivering top-notch solutions, we encourage you to apply for this exciting opportunity.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Senior Lead Engineer specializing in Python and Spark within AWS, you will be tasked with designing, building, and maintaining robust, scalable, and efficient ETL pipelines. Your primary focus will be ensuring alignment with the data lakehouse architecture on AWS. You will leverage your extensive expertise in Python and Spark to develop and optimize workflows using AWS services such as Glue, Glue Data Catalog, Lambda, and S3. In this role, you will implement data quality and governance frameworks to guarantee reliable and consistent data processing across the platform. Collaborating with cross-functional teams, you will gather requirements, provide technical insights, and deliver high-quality data solutions. Your responsibilities will also include driving the migration of existing data processing workflows to the lakehouse architecture by leveraging Iceberg capabilities. As a key member of the team, you will establish and enforce best practices for coding standards, design patterns, and system architecture. Monitoring and improving system performance and data reliability through proactive analysis and optimization techniques will be essential. Additionally, you will lead technical discussions, mentor team members, and promote a culture of continuous learning and innovation. Your interactions will primarily involve senior management and the architectural group, development managers and team leads, data engineers and analysts, as well as agile team members. Therefore, excellent interpersonal skills, both verbal and written, will be crucial in articulating complex technical solutions to diverse audiences. To excel in this role, you must possess a consistent track record of designing and implementing complex data processing workflows using Python and Spark. Strong experience with AWS services such as Glue, Glue Data Catalog, Lambda, S3, and EMR is essential, with a focus on data lakehouse solutions. A deep understanding of data quality frameworks, data contracts, and governance standards and processes will also be required. Furthermore, the ability to design and implement scalable, maintainable, and secure architectures using modern data technologies is crucial. Hands-on experience with Apache Iceberg and its integration within data lakehouse environments, along with expertise in problem-solving and performance optimization for data workflows, will be key skills for success in this role. Desirable skills include familiarity with additional programming languages such as Java, experience with serverless computing paradigms, and knowledge of data visualization or reporting tools for stakeholder communication. Certification in AWS or data engineering (e.g., AWS Certified Data Analytics, Certified Spark Developer) would be advantageous. A bachelor's degree in Computer Science, Software Engineering, or a related field is helpful, although equivalent professional experience or certifications will also be considered. By joining our dynamic organization at LSEG, you will have the opportunity to contribute to driving financial stability, empowering economies, and enabling sustainable growth, all while being part of a collaborative and creative culture that values diversity and sustainability.
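As an illustrative sketch of the lakehouse pattern this listing mentions (not the team's actual code), here is a PySpark write to an Apache Iceberg table registered in the Glue Data Catalog; the catalog name, warehouse path, and table identifiers are hypothetical, and the Iceberg Spark runtime and AWS bundle jars are assumed to be on the classpath:

```python
# Hedged sketch: writing a DataFrame to an Iceberg table via the Glue Catalog.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-lakehouse-sketch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse/")
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/staging/events/")  # hypothetical input

# writeTo(...).using("iceberg") needs Spark 3.x with the Iceberg runtime available.
events.writeTo("lake.analytics.events").using("iceberg").createOrReplace()
```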
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
The role involves developing and maintaining scalable server-side applications using Node.js, Express.js, and TypeScript. You will be responsible for designing robust and secure RESTful APIs with proper routing, middleware, and error handling. Additionally, building and optimizing relational database schemas using PostgreSQL to ensure performance, normalization, and data integrity will be a key part of your responsibilities. You will integrate and manage ORMs like Prisma or TypeORM for efficient and type-safe database operations, implement authentication and authorization using JWT, session-based methods, and OAuth protocols, and validate request and response data using Zod or Joi to ensure type safety and data integrity. Handling file uploads and media storage using Multer, and integrating with services like Cloudinary, AWS S3, or similar platforms will also be required. Writing clean, testable, and modular code following SOLID principles, creating and maintaining API documentation using tools like Postman or Swagger, and implementing security best practices such as input sanitization, rate limiting, secure headers, and CORS configuration are crucial tasks. You will also perform unit and integration testing using Jest and Supertest, collaborate closely with frontend developers to define and deliver seamless API experiences, and manage deployments using platforms like Vercel, Render, Railway, DigitalOcean, or AWS (EC2/S3). Configuring CI/CD pipelines using GitHub Actions, PM2, or Docker for automated builds and deployments, handling environment configuration securely using .env files and secret managers, working with version control (Git) to manage the codebase, branches, and code reviews, monitoring and debugging production issues to ensure application reliability and performance, and building real-time features using WebSockets or Socket.IO are additional responsibilities. The ideal candidate should have expertise in Node.js, Express.js, TypeScript, PostgreSQL, ORMs like Prisma or TypeORM, and authentication methods such as JWT, session-based auth, and OAuth. Knowledge of MongoDB, WebSockets, API documentation tools, Git, basic testing frameworks, security best practices, CI/CD pipelines, deployment platforms, environment variable management, cloud platforms, clean code practices, and strong debugging skills is desired. About Company: The company is a photography firm based in Noida, operating across India and internationally. Their primary services include wedding and pre-wedding shoots, maternity photoshoots, newborn photography, birthday and pre-birthday shoots, as well as corporate and event coverage. To learn more about their work, visit www.theimpressio.com and www.theimpressio.in.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for designing, developing, and maintaining SQL Server Analysis Services (SSAS) models to support business intelligence reporting. Your role will involve creating and managing OLAP cubes, as well as developing and implementing multidimensional and tabular data models. You will be tasked with optimizing the performance of SSAS solutions for efficient query processing. Additionally, you will be required to integrate data from various sources into SQL Server databases and SSAS models. Knowledge of AWS S3 and SQL Server PolyBase is preferred. The ideal candidate should have 5 to 8 years of experience in SQL development with expertise in SSAS and OLAP. This position is based pan-India.
Posted 2 weeks ago
4.0 - 6.0 years
4 - 9 Lacs
Kolkata, Pune, Chennai
Work from Office
We are seeking an experienced Python Developer with strong hands-on expertise in AWS cloud services and data libraries. The ideal candidate will be proficient in designing and deploying applications using Python and AWS (Lambda, EC2, S3), and will be familiar with DevOps tools such as GitLab. Experience with NumPy and Pandas for data processing or ML-related tasks is essential.
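A minimal sketch of the sort of task this listing describes, assuming a hypothetical bucket and key: reading a CSV from S3 with boto3 and summarising it with pandas:

```python
# Illustrative only: pull a CSV from S3 and aggregate it with pandas.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-bucket", Key="input/sales.csv")  # hypothetical object
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Simple aggregation; "region" and "amount" are assumed column names.
summary = df.groupby("region", as_index=False)["amount"].sum()
print(summary.head())
```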
Posted 3 weeks ago
4.0 - 6.0 years
6 - 15 Lacs
Pune
Work from Office
Leading the day-to-day technical operations, providing the highest levels of availability, reliability, and scalability of the services; system-level debugging, incident response, automation, and infrastructure observability. Experience working in the AWS cloud.
Required candidate profile: strong experience in Linux/Unix environments; Docker/Kubernetes; Python or Bash scripting; deploying, optimizing, and troubleshooting applications; monitoring and logging tools such as New Relic and ELK.
Posted 3 weeks ago
6.0 - 12.0 years
0 Lacs
Karnataka
On-site
Your role as a Supervisor at Koch Global Services India (KGSI) will involve being part of a global team dedicated to creating new solutions and enhancing existing ones for Koch Industries. With over 120,000 employees worldwide, Koch Industries is a privately held organization engaged in manufacturing, trading, and investments. KGSI is being established in India to expand its IT operations and serve as an innovation hub within the IT function. This position offers the chance to join at the inception of KGSI and play a pivotal role in its development over the coming years. You will collaborate closely with international colleagues, providing valuable global exposure to the team. In this role, you will lead a team responsible for developing innovative solutions for KGS and its customers. You will oversee the performance and growth of data engineers at KGSI, ensuring the delivery of application solutions. Collaboration with global counterparts will be essential for enterprise-wide delivery success. Your responsibilities will include mentoring team members, providing feedback, and coaching them for their professional growth. Additionally, you will focus on understanding individual career aspirations, addressing challenges, and facilitating relevant training opportunities. Ensuring compensation aligns with Koch's philosophy and maintaining effective communication with HR will be key aspects of your role. Timely delivery of projects is crucial, and you will be responsible for identifying and addressing delays proactively. By fostering knowledge sharing and best practices within the team, you will contribute to the overall success of KGSI. Staying updated on market trends, talent acquisition, and talent retention strategies will be vital for your role. Your ability to lead by example, communicate effectively, and solve problems collaboratively will be essential in driving team success. To qualify for this role, you should hold a Bachelor's or Master's degree in computer science or information technology with a minimum of 12 years of IT experience, including leadership roles in integration teams. A solid background in data engineering, AWS cloud migration, and team management is required. Strong communication skills, customer focus, and a proactive mindset towards innovation are essential for success in this position. Experience with AWS Lambda, Glue, ETL projects, Python, SQL, and BI tools will be advantageous. Familiarity with manufacturing business processes and exposure to Scrum Master practices would be considered a plus. Join Koch Global Services (KGS) to be part of a dynamic team that creates solutions to support various business functions worldwide. With a global presence in India, Mexico, Poland, and the United States, KGS empowers employees to make a significant impact on a global scale.
Posted 3 weeks ago
5.0 - 10.0 years
20 - 27 Lacs
Pune
Remote
Work Hours: Partial overlap with US PST
Key Responsibilities: Rapidly prototype MVPs and innovative ideas within tight timelines. Own end-to-end development and deployment of applications in non-production AWS environments. Collaborate with cross-functional teams to deliver scalable web solutions.
Technical Expertise:
1. Front-End Development: Proficiency in React, AWS S3, and AWS CloudFront. Experience building medium to large websites (15-20+ pages).
2. Back-End & Serverless Architecture: Strong understanding of microservices architecture. Experience with the AWS serverless stack: Lambda (Node.js), Cognito, API Gateway, EventBridge, Step Functions. Familiarity with AWS Aurora MySQL and DynamoDB (preferred but not mandatory).
3. DevOps & CI/CD: Proficiency in AWS SAM and CloudFormation or AWS CDK. Experience with AWS CodePipeline or equivalent tools (e.g., GitHub Actions).
Requirements (Experience & Qualifications): 5-10 years of software development experience. Minimum 3 years of hands-on experience with React and AWS technologies. Fast learner with the ability to adapt in a dynamic environment.
Posted 3 weeks ago
4.0 - 6.0 years
10 - 13 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Role & responsibilities: Data tester (functional testing). Automation experience, preferably in Python/PySpark. Working knowledge of NoSQL, preferably MongoDB (or JSON format). Working knowledge of AWS S3. Working knowledge of JIRA/Confluence and defect management. Understands Agile ways of working. Years of experience: minimum 3 years.
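For illustration, a small hedged example of an automated functional data check in PySpark of the kind this role suggests; the S3 paths and the customer_id key column are made up:

```python
# Illustrative reconciliation check between a landing dataset and its curated copy.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-check-sketch").getOrCreate()

source = spark.read.json("s3://example-bucket/landing/customers/")    # hypothetical paths
target = spark.read.parquet("s3://example-bucket/curated/customers/")

# Basic functional checks: row counts match and no duplicate business keys.
assert source.count() == target.count(), "row count mismatch"

dupes = target.groupBy("customer_id").count().filter(F.col("count") > 1)
assert dupes.count() == 0, "duplicate customer_id values in target"
```

Checks like these would typically be wrapped in a test framework and any failures logged as defects in JIRA.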
Posted 4 weeks ago
7.0 - 12.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Role: Data Engineer
Experience: 7+ years
Notice: Immediate
Skills: AWS (S3, Glue, Lambda, EC2), Spark, PySpark, Python, Airflow
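As a hedged sketch of how the Airflow piece of this stack is commonly wired up (names, schedule, and the placeholder task are illustrative, not this employer's pipeline; assumes Airflow 2.x):

```python
# Minimal daily Airflow DAG with a placeholder transform step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_transform(**_):
    # In a real pipeline this would submit a Spark/Glue job or call boto3.
    print("transform step placeholder")


with DAG(
    dag_id="daily_ingest_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(task_id="transform", python_callable=run_transform)
```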
Posted 1 month ago
6.0 - 8.0 years
5 - 6 Lacs
Navi Mumbai, SBI Belapur
Work from Office
Relevant years of experience: 6-8 years
Mandatory skills: Oracle GoldenGate SME
Detailed JD:
1. Oracle GoldenGate Architecture & Implementation: Design and implement high-availability and fault-tolerant GoldenGate solutions across multiple environments (on-prem, cloud, hybrid). Install, configure, and optimize Oracle GoldenGate 21c/23ai for heterogeneous databases (Oracle, MSSQL, MySQL, PostgreSQL). Set up Extract, Pump, and Replicat processes for source-to-target data replication. Implement downstream mining for redo log-based replication in Active Data Guard environments. Configure GoldenGate for Big Data to integrate with platforms like Kafka and AWS S3.
2. Performance Tuning & Optimization: Fine-tune GoldenGate Extract, Pump, and Replicat processes to handle high-transaction loads (~20 TB of logs/day). Optimize ACFS, XAG, and RAC configurations for high availability. Implement multi-threaded Replicat (MTR) for parallel processing and improved performance. Configure compression techniques for efficient data transfer between hubs.
3. Monitoring & Troubleshooting: Set up the OEM GoldenGate plugin for real-time monitoring of replication health and performance. Troubleshoot latency issues, data integrity errors, and replication lag. Monitor Kafka offsets to ensure efficient data consumption by downstream systems. Validate data integrity using Oracle Veridata or manual comparison techniques.
4. Cloud & Big Data Integration: Implement Oracle GoldenGate for Big Data (OGG-BD) for streaming replication to Kafka and AWS S3. Design data lakehouse architectures for real-time data ingestion into cloud platforms. Configure Parquet and Avro file formats for efficient storage in AWS S3.
5. Security & Compliance: Implement TLS encryption and secure log transport between source, downstream, and target systems. Ensure compliance with enterprise data governance policies.
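The Kafka integration and offset-monitoring duties above could be exercised from Python with a generic consumer like the sketch below; this is illustrative only, not GoldenGate configuration itself, and the broker address, consumer group, topic name, and op_type field are assumptions:

```python
# Generic consumer for change records that GoldenGate for Big Data might publish
# to Kafka; all identifiers are placeholders.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",
    "group.id": "ogg-replication-monitor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ORDERS_CDC"])  # hypothetical topic fed by the Replicat

try:
    for _ in range(100):                       # bounded poll loop for the sketch
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        record = json.loads(msg.value())
        # Print topic, partition, and offset alongside the change-record type.
        print(msg.topic(), msg.partition(), msg.offset(), record.get("op_type"))
finally:
    consumer.close()
```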
Posted 1 month ago
0.0 - 3.0 years
3 - 7 Lacs
Thane
Hybrid
Responsibilities: Provide first-line and second-line technical support to customers via email, phone, or chat. Diagnose and resolve software issues, bugs, and technical queries efficiently and effectively. Create and maintain knowledge base articles, documentation, and FAQs for both internal and customer use. Assist with system monitoring and performance tuning to ensure software stability. Assist customers in product feature usage, configurations, and best practices. Provide training to end-users or internal teams on new features and functionalities. Log and track incidents in the support management system, ensuring that all issues are addressed promptly. Stay up-to-date with the latest software releases, patches, and updates.
Posted 1 month ago
7.0 - 9.0 years
27 - 30 Lacs
Bengaluru
Work from Office
We are seeking experienced Data Engineers with over 7 years of experience to join our team at Intuit. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.
Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.
Required Skills: 7+ years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.
Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
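To illustrate the Athena part of this stack, a hedged boto3 sketch that submits a query and polls for completion; the database, table, region, and results bucket are placeholders:

```python
# Illustrative only: run an Athena query from Python and wait for it to finish.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")  # hypothetical region

qid = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM raw_events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},                      # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(rows[:5])
```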
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Interested candidates can also apply with Sanjeevan Natarajan - 94866 21923 sanjeevan.natarajan@careernet.in
Role & responsibilities:
Technical Leadership: Lead a team of data engineers and developers; define technical strategy, best practices, and architecture for data platforms.
End-to-End Solution Ownership: Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks.
Data Pipeline Strategy: Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets.
Data Governance & Quality: Enforce data validation, lineage, and quality checks across the data lifecycle. Define standards for metadata, cataloging, and governance.
Orchestration & Automation: Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations.
Cloud Cost & Performance Optimization: Implement performance tuning strategies, cost optimization best practices, and efficient cluster configurations on AWS/Databricks.
Security & Compliance: Define and enforce data security standards, IAM policies, and compliance with industry-specific regulatory frameworks.
Collaboration & Stakeholder Engagement: Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions.
Migration Leadership: Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime.
Mentorship & Growth: Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team.
Preferred candidate profile: Python, SQL, PySpark, Databricks, AWS (mandatory). Leadership experience in data engineering/architecture. Added advantage: experience in Life Sciences / Pharma.
Posted 1 month ago
5.0 - 10.0 years
40 - 45 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Notice: Immediate joiners only
Design, develop, and maintain SQL Server Analysis Services (SSAS) models. Create and manage OLAP cubes to support business intelligence reporting. Develop and implement multidimensional and tabular data models. Optimize the performance of SSAS solutions for efficient query processing. Integrate data from various sources into SQL Server databases and SSAS models. Preferably, knowledge of AWS S3 and SQL Server PolyBase.
Location: Bangalore, Pune, Gurgaon, Noida, Hyderabad
Posted 1 month ago
2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in over 32 countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
ETL tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & orchestration (any one, good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming & scripting: SQL (advanced); Python (mandatory); Bash/Shell (mandatory); Java or Scala (optional, for Spark)
Databases & data warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud & data storage (any one or two, mandatory): AWS S3 / Azure Blob Storage / Google Cloud Storage; Kafka / Kinesis / Pub/Sub
Interested candidates can also share their resume at shivani.p@rezlive.com
Posted 1 month ago