2.0 - 5.0 years
15 - 30 Lacs
Chandigarh
Work from Office
Job Description: We are looking for a skilled Node.js Developer with strong experience in React.js to join our development team. As a full-stack developer, you will develop and maintain the backend using Node.js while building the frontend with React.js to deliver high-quality web applications.

Key Responsibilities:
- Develop and maintain scalable, efficient, and high-performance server-side applications using Node.js.
- Build interactive and dynamic web applications using React.js and integrate them with backend APIs.
- Design and implement RESTful APIs and services.
- Work closely with the UX/UI team to create seamless user interfaces.
- Optimize web applications for maximum speed and scalability.
- Troubleshoot, debug, and maintain the codebase to improve application functionality.
- Participate in code reviews and contribute to best practices in software development.
- Work collaboratively in an agile environment with product managers, designers, and other developers.

Required Skills & Qualifications:
- Proven experience as a Node.js Developer with React.js expertise.
- Strong proficiency in JavaScript (ES6+), Node.js, and React.js.
- Familiarity with frontend technologies like HTML5, CSS3, and JavaScript frameworks.
- Experience with server-side frameworks such as Express.js or Koa.js.
- Familiarity with databases such as MongoDB, MySQL, or PostgreSQL.
- Knowledge of version control systems like Git.
- Experience with RESTful API design and third-party API integration.
- Strong understanding of asynchronous programming and handling concurrency.
- Excellent problem-solving skills and attention to detail.

Preferred Qualifications:
- Experience with cloud services (AWS, Azure, Google Cloud).
- Knowledge of CI/CD pipelines and containerization (Docker).
- Experience with TypeScript.
- Familiarity with Agile/Scrum methodologies.

Soft Skills:
- Strong communication skills and ability to work in a team-oriented environment.
- Ability to quickly learn new technologies and adapt to changing environments.
- Strong attention to detail and ability to troubleshoot complex issues.
Posted 1 month ago
2.0 - 4.0 years
6 - 7 Lacs
Chennai, Bengaluru
Work from Office
- Develop and maintain machine learning models, with a focus on computer vision algorithms.
- Implement image processing techniques such as object detection, segmentation, and image classification using Convolutional Neural Networks (CNNs).
- Work with TensorFlow or PyTorch to develop, train, and deploy vision models.
- Utilize OpenCV for tasks like feature extraction, image filtering, and object tracking.
- Collaborate with cross-functional teams to enhance AI-driven solutions in the mobile coding domain.
- Experience with Keras for quick prototyping of deep learning models.
- Familiarity with Large Language Models (LLMs) and their applications in AI.
- Knowledge of cloud platforms.

Location: Remote / Chennai / Bengaluru
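The CNN-based classification work above ultimately rests on one primitive: sliding a kernel over an image to produce a feature map. As a rough, framework-free sketch (plain Python, a toy 4x4 image, and an illustrative edge-detection kernel; real work would use OpenCV/PyTorch as listed):

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution: the core operation behind CNN feature maps."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# Toy image with a sharp vertical edge (left half dark, right half bright)
# and a Sobel-style vertical-edge kernel.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
edge_kernel = [
    [-1, 0, 1],
    [-2, 0, 2],
    [-1, 0, 1],
]
feature_map = conv2d(image, edge_kernel)  # strong response along the edge
```

A CNN learns many such kernels instead of hand-coding them, but the sliding-window arithmetic is the same.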
Posted 1 month ago
1.0 - 3.0 years
2 - 3 Lacs
Surat
Work from Office
We're Hiring! IT Administrator – Surat, India
Company: Florida Steel Frame and Truss Mfg. LLC (FSFTM)
Location: Surat, Gujarat – India Office
Apply at: isha.t@fsftm.com

Are you a tech-savvy problem solver with a passion for IT infrastructure, cloud solutions, and modern collaboration tools? We are looking for an IT Administrator who can manage and enhance our technology ecosystem at our growing Surat office.

Key Skills Required:
- Proficiency in cloud services (Google Cloud)
- Hands-on experience with Google Workspace administration
- Admin-level skills in Zoho Suite and Zoom
- Strong knowledge of networking, IT security, and data backups
- Familiarity with AI tools and automation workflows
- Troubleshooting hardware/software issues with proactive problem-solving
- Experience managing SaaS platforms and remote tools
- Excellent communication and coordination skills
- Engineering or Computer Science background is a plus

Job Responsibilities:
- Oversee day-to-day IT operations at the Surat office
- Administer and support Google Workspace, Zoho tools, Zoom, and other internal software
- Monitor and manage cloud services, ensuring uptime and data security
- Maintain systems and troubleshoot technical issues for both on-site and remote users
- Implement automation tools and explore AI-based solutions to improve workflows
- Provide IT support during onboarding/offboarding of employees
- Ensure data backups, antivirus protection, and compliance with company IT policies
- Coordinate with U.S. headquarters and vendors for procurement and tech upgrades

Qualifications:
- Bachelor's degree in Computer Engineering / IT / Computer Applications
- Minimum 2–4 years of IT administration experience
- Certifications in cloud platforms or security (preferred)
Posted 1 month ago
5.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to lead the DevOps team to facilitate better coordination among operations, development, and testing functions by automating and streamlining the integration and deployment processes.

Do
- Drive technical solution support to the team to align on continuous integration (CI) and continuous deployment (CD) of technology in applications
- Design and define the overall DevOps architecture/framework for a project/module delivery as per the client requirement
- Decide on the DevOps tools & platform to be deployed, aligned to the customer's requirement
- Create a tool deployment model for validating, testing, and monitoring performance, and align or provision resources accordingly
- Define & manage the IT infrastructure as per the requirements of the supported software code
- Manage and drive the DevOps pipeline that supports the application life cycle across the DevOps toolchain: from planning, coding and building, to testing, staging, release, configuration, and monitoring
- Work with the team on the coding and scripting needed to connect elements of the code required to run the software release with operating systems and production infrastructure, with minimum disruption
- Ensure onboarding of application configuration from planning to release stage
- Integrate security into the entire DevOps lifecycle to ensure data privacy is maintained and no cyber risk is introduced
- Provide customer support/service on the DevOps tools
- Provide timely support for internal & external customer escalations on multiple platforms
- Troubleshoot the various problems that arise in the implementation of DevOps tools across the project/module
- Perform root cause analysis of major incidents/critical issues which may hamper project timeliness, quality, or cost
- Develop alternate plans/solutions to be implemented as per root cause analysis of critical problems
- Follow the escalation matrix/process as soon as a resolution becomes complicated or isn't resolved
- Provide knowledge transfer, share best practices with the team, and motivate team members

Team Management
Resourcing
- Forecast talent requirements as per current and future business needs
- Hire adequate and right resources for the team
- Train direct reportees to make right recruitment and selection decisions
Talent Management
- Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability & effectiveness
- Build an internal talent pool of HiPos and ensure their career progression within the organization
- Promote diversity in leadership positions
Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports
- In case of performance issues, take necessary action, with zero tolerance for will-based performance issues
- Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities presented by such programs, for themselves and the levels below them
Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
- Exercise employee recognition and appreciation

Deliver:
1. Continuous Integration, Deployment & Monitoring: 100% error-free onboarding & implementation
2. CSAT: manage service tools, troubleshoot queries, customer experience
3. Capability Building & Team Management: % trained on new-age skills, team attrition %, employee satisfaction score

Mandatory Skills: Google Cloud DevOps.
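The CI/CD pipeline responsibilities above boil down to running stages in order and stopping at the first failure, so a broken build never reaches deployment. A minimal, tool-agnostic sketch in Python (stage names and the toy pass/fail callables are illustrative assumptions, not any specific DevOps platform's API):

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Stage:
    name: str
    run: Callable[[], bool]  # returns True on success


def run_pipeline(stages: List[Stage]) -> List[str]:
    """Run stages in order, stopping at the first failure (fail-fast),
    and return a log of what happened."""
    log = []
    for stage in stages:
        ok = stage.run()
        log.append(f"{stage.name}: {'ok' if ok else 'FAILED'}")
        if not ok:
            break  # later stages (e.g. deploy) never run after a failure
    return log


pipeline = [
    Stage("build", lambda: True),
    Stage("test", lambda: False),   # simulated test failure
    Stage("deploy", lambda: True),  # must not run
]
result = run_pipeline(pipeline)
```

Real toolchains (Jenkins, GitLab CI, Cloud Build) express the same fail-fast ordering declaratively in pipeline config rather than in code.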
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Kochi
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
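One concrete database optimization technique the role calls for, adding an index so a filtered query becomes an index lookup instead of a full-table scan, can be demonstrated with Python's built-in sqlite3 module (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the same query becomes an index lookup.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
```

The same before/after comparison (via EXPLAIN or EXPLAIN ANALYZE) is the standard tuning workflow on MySQL and PostgreSQL as well.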
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Guwahati
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Kanpur
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
Posted 1 month ago
9.0 years
0 - 0 Lacs
Hyderabad
Work from Office
Mandatory Skill Set:
- Expertise in programming languages like Java, JavaScript, or Python
- Strong experience in UI and mobile automation using Java, Selenium, TestNG, Appium
- Experience in creating test automation frameworks at both the frontend and backend layers
- Strong expertise in SQL and test data management
- Extensive experience in CI/CD pipelines using Jenkins, GitLab CI, ADO, or similar
- In-depth understanding of DevOps practices and tools, including containerisation and orchestration with tools like Docker and Kubernetes
- Experience working with cloud technologies like AWS, Azure, Google Cloud and mobile cloud platforms like HeadSpin, BrowserStack, etc.

Good to have / Optional:
- Mastery of performance testing tools like Apache JMeter or k6 and analysing system performance
- Proficiency in security testing tools like OWASP ZAP or Burp Suite
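A staple of the UI/mobile automation experience listed above is the explicit-wait pattern (what Selenium's WebDriverWait does): poll a condition until it holds or a timeout expires. A minimal stand-alone sketch in plain Python, with a simulated element in place of a real driver:

```python
import time


def wait_until(condition, timeout=5.0, poll_interval=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.
    Mirrors the explicit-wait pattern used by Selenium/Appium frameworks."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)


# Simulate an element that only "appears" on the third poll.
state = {"calls": 0}

def element_visible():
    state["calls"] += 1
    return state["calls"] >= 3

found = wait_until(element_visible, timeout=2.0, poll_interval=0.01)
```

Explicit waits like this replace fixed `sleep()` calls, making UI suites both faster and less flaky.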
Posted 1 month ago
4.0 - 9.0 years
9 - 19 Lacs
Hyderabad, Bengaluru
Work from Office
Key Responsibilities:
Python & PySpark:
- Writing efficient ETL (Extract, Transform, Load) pipelines
- Implementing data transformations using PySpark DataFrames and RDDs
- Optimizing Spark jobs for performance and scalability
Apache Spark:
- Managing distributed data processing
- Implementing batch and streaming data processing
- Tuning Spark configurations for efficient resource utilization
Unix Shell Scripting:
- Automating data workflows and job scheduling
- Writing shell scripts for file management and log processing
- Managing cron jobs for scheduled tasks
Google Cloud Platform (GCP) & BigQuery:
- Designing data warehouse solutions using BigQuery
- Writing optimized SQL queries for analytics
- Integrating Spark with BigQuery for large-scale data processing
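The extract/transform/load shape described above can be sketched without a Spark cluster; this plain-Python version mirrors the filter and group-by steps a PySpark DataFrame job would perform (column names and sample data are invented for illustration):

```python
import csv
import io

# Extract: parse raw CSV (stands in for reading from a source system).
raw = "user_id,amount\n1,10.5\n2,-3.0\n1,4.5\n3,7.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop invalid records, then aggregate per user -- the same
# filter / groupBy / sum shape a PySpark DataFrame pipeline would use.
valid = [r for r in rows if float(r["amount"]) > 0]
totals = {}
for r in valid:
    totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + float(r["amount"])

# Load: emit the aggregate (stands in for writing to a warehouse table).
loaded = sorted(totals.items())
```

In PySpark the transform step would read roughly as `df.filter(df.amount > 0).groupBy("user_id").sum("amount")`, with Spark distributing the same logic across partitions.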
Posted 1 month ago
4.0 - 9.0 years
8 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Role & Responsibilities
Job Description: We are looking for an independent contributor experienced in the Data Engineering space. Primary responsibilities include implementing large-scale data processing (structural, statistical, etc.) pipelines, creating production inference pipelines, associated APIs, and analytics that support and provide insights for data-driven decision making. Designs and develops data models, APIs, and pipelines to handle analytical workloads, data sharing, and movement across multiple systems at various grains in a large-scale data processing environment. Designs and maintains data systems and data structures for optimal read/write performance. Implements machine learning or statistical/heuristic learning in data pipelines based on input from Data Scientists.

Roles and Responsibilities:
- Work in data streaming, movement, data modelling, and data pipeline development
- Develop pipelines and data model changes in support of rapidly emerging business and project requirements
- Develop code and maintain systems to support the analytics infrastructure & data lake
- Partner on and contribute to data analysis and machine learning pipelines
- Design data recovery processes and alternate pipelines to check data quality
- Create and maintain continuous data quality evaluation processes
- Optimize performance of the analytics platform and develop self-healing workflows
- Be part of a global team; collaborate and co-develop solutions

Qualifying Criteria:
- Bachelor's degree in computer science, information technology, or engineering
- 5+ years of prior experience in Data Engineering and databases
- Experience with a code-based ETL framework like Airflow/Prefect
- Experience with Google BigQuery, Google Pub/Sub, Google Dataflow
- Experience building data pipelines on AWS or GCP
- Experience developing data APIs and pipelines using Python
- Experience with databases like MySQL/Postgres
- Experience with intermediate Python programming
- Experience with advanced SQL (analytical queries)

Preferred Qualifications:
- Experience with visualization tools like Tableau/QlikView/Looker
- Experience building machine learning pipelines

Mandatory Skills: Data Engineering, Python, Airflow, AWS/Google Cloud/GCP, Data Streaming, Data Lake, Data Pipelines, Google BigQuery, ETL, Google Pub/Sub, Google Dataflow, REST API, MySQL, Postgres, SQL Analytics
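A "continuous data quality evaluation process" like the one mentioned above can be as simple as running named validation rules over each record and counting failures per rule. A minimal sketch (rule names and sample rows are hypothetical):

```python
def evaluate_quality(rows, rules):
    """Apply named validation rules to each row; return failure counts per rule."""
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                failures[name] += 1
    return failures


rows = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},          # missing email
    {"id": 3, "email": "c@example.com", "age": -1},  # impossible age
]
rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_in_range": lambda r: 0 <= r["age"] <= 120,
}
report = evaluate_quality(rows, rules)
```

In production such checks typically run as a scheduled Airflow task after each load, alerting when a failure count crosses a threshold.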
Posted 1 month ago
5.0 - 10.0 years
15 - 22 Lacs
Hyderabad
Work from Office
data engineering, data management, Snowflake , SQL, data modeling, and cloud-native data architecture AWS, Azure, or Google Cloud (Snowflake on cloud platforms) ETL tools such as Informatica, Talend, or dbt Python or Shell scripting
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Varanasi
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Faridabad
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Vadodara
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
Posted 1 month ago
4.0 - 7.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Duration: 12 Months
Work Type: Onsite

Job Description: The Marketing Engineering organization in Strategy, Planning & Operations is looking for a Data Engineer eager to provide the engineering muscle for core engineering work around data architecture, data pipeline development, and deploying production workflows to continue improving our data & reporting needs for Marketing in an AI future.

You will be responsible for:
- Working with a technical leader to lead major engineering initiatives that support decision-making by the business
- Helping manage production data workflows, ensuring timely updates are made through a standard SDLC process
- Helping design and develop new architectures and workflows to support the growing business needs of the client's Marketing organization

Who You'll Work With:
- You will report to a manager within Advanced Analytics and work directly with a Team Leader in the Engineering Center of Excellence
- You will work closely with modelers, business stakeholders, and analytics translators, as well as IT

Who you are: You like working with data and love getting the business to adopt data insights.

Minimum qualifications:
- Experience with data modeling, data warehousing, and building ETL pipelines
- Experience in SQL, Python, and Unix/Bash scripting
- Experience working with Docker, Git, and cloud platforms like Google Cloud and Azure
- Writing complex, optimized SQL queries across large data sets
- Experience communicating complex technical concepts to a broad variety of audiences
- Proven success communicating with users, other technical teams, and senior management to collect requirements and describe data modeling decisions and data engineering strategy

Preferred Qualifications:
- BS/MS in Computer Science/Engineering
- 4-7 years of experience in a Data Engineering or similar role
- Knowledge of software engineering standard methodologies across development lifecycles, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
Posted 1 month ago
5.0 - 8.0 years
10 - 20 Lacs
Chennai
Work from Office
About the Role: We are seeking a skilled RPA Developer with expertise in UiPath and strong backend development experience in Java, Python, and .NET. The ideal candidate will have a solid understanding of robotic process automation (RPA) development, along with the technical expertise to integrate backend technologies to enhance automation processes. You will be responsible for developing, designing, and maintaining RPA workflows that integrate with multiple systems using backend technologies.

Key Responsibilities:
- RPA Development: Design, develop, and deploy RPA solutions using UiPath, automating repetitive processes, improving operational efficiency, and ensuring accuracy.
- Backend Integration: Collaborate with backend development teams to integrate RPA solutions with existing backend technologies such as Java, Python, and .NET to improve system automation and data flow.
- Process Analysis: Analyze business processes, identify automation opportunities, and create detailed workflows to automate tasks while ensuring the scalability and security of the automation.
- Solution Design & Architecture: Design and implement RPA solutions in alignment with business goals and IT strategy, ensuring seamless integration with enterprise systems and applications.
- Coding & Scripting: Write and maintain backend code in Java, Python, and .NET to support RPA functionality, including building custom APIs, handling data transformations, and ensuring smooth communication between systems.
- Testing & Debugging: Ensure thorough testing, troubleshooting, and debugging of RPA solutions and backend integrations to ensure stability and reliability.
- Maintenance & Support: Provide ongoing support and maintenance for existing RPA workflows, including enhancements, bug fixes, and optimizations.
- Collaboration & Documentation: Collaborate with cross-functional teams (Business Analysts, IT, and Operations) to gather requirements and translate them into technical solutions. Document RPA workflows, backend integration details, and system architecture.

Key Skills & Qualifications:
- RPA Development: Hands-on experience with the UiPath RPA platform, including development, deployment, and maintenance of automation processes.
- Programming Skills: Strong proficiency in Java, Python, and .NET for backend integration and development of custom components. Experience with RESTful APIs, web services, and database connectivity (SQL, NoSQL) to facilitate backend integration.
- Experience with Automation Tools: Familiarity with additional RPA tools is a plus (e.g., Automation Anywhere, Blue Prism).
- Backend Technologies: Deep understanding of backend technologies and frameworks in Java, Python, and .NET to build integrations, perform data transformations, and optimize system performance.

Education & Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Minimum 5-8 years of experience working with RPA (UiPath), Java, Python, and .NET.
- Proven experience in developing backend solutions and integrating with RPA tools.

Desired Skills (Optional):
- Certifications in UiPath RPA.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus.
- Familiarity with DevOps practices and CI/CD pipelines.
Posted 1 month ago
8.0 - 10.0 years
10 - 12 Lacs
Mumbai
Work from Office
Job Title: Informatica CDQ Lead (Contract - Work From Office, Mumbai)

We have an immediate requirement for an experienced Informatica CDQ Lead for a contract position in Mumbai. The ideal candidate should have expertise in Informatica Cloud Data Quality (CDQ) and strong data management skills.

Key Responsibilities:
- Lead Informatica CDQ implementations for multiple projects.
- Develop and maintain data quality rules, scorecards, and dashboards.
- Collaborate with data architects and stewards to define solutions.
- Identify and resolve data quality issues to ensure accuracy.
- Work with business teams to understand requirements and provide solutions.
- Conduct root cause analysis and recommend improvements.
- Provide support and guidance to team members.
- Monitor and manage data quality performance with reports.

Required Skills and Qualifications:
- 6+ years of experience with Informatica CDQ.
- Expertise in data quality solutions and governance.
- Strong knowledge of ETL processes and Informatica CDQ rules.
- Experience in leading data quality initiatives and teams.
- Familiarity with AWS, Azure, or Google Cloud is a plus.
- Excellent problem-solving and leadership skills.

Location: Mumbai (Work From Office)

To Apply: Please share your resume with:
- CTC
- Expected CTC
- Location (Mumbai)
- Notice Period (Immediate preferred)
Email: navaneetha@suzva.com
Posted 1 month ago
4.0 - 5.0 years
4 - 8 Lacs
Noida, Surat, Sector 62
Work from Office
As a Data Analytics Trainer, you will be responsible for delivering engaging and informative training on various data analytics concepts and tools. You will guide learners through the process of analyzing, visualizing, and interpreting data, ensuring they develop the skills necessary to make data-driven decisions. Preference will be given to candidates with Data Science experience, including knowledge of machine learning and advanced data analysis techniques.

Key Responsibilities:
- Training Delivery: Conduct interactive training sessions on topics including data collection, cleaning, analysis, and visualization using tools like Excel, SQL, Python, R, Tableau, and Power BI.
- Curriculum Development: Develop and update training materials and exercises that align with current industry trends and best practices.
- Learner Support: Offer guidance, feedback, and mentorship to help learners master key concepts and apply them effectively in their careers.
- Advanced Topics (Preferred): Provide optional or advanced training on machine learning, statistical modeling, and data science topics such as Python libraries (Pandas, NumPy, Scikit-learn) and TensorFlow.
- Assessment and Evaluation: Design and administer quizzes, tests, and assignments to track learner progress and provide feedback for improvement.
- Professional Development: Stay updated on the latest trends in data analytics and data science to improve your teaching and course content.
- Collaboration: Work closely with the training team to ensure seamless course delivery and contribute to the development of new programs.

Qualifications:
- Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Mathematics, Statistics, or a related field.
- Experience: 2+ years of experience in data analytics. Experience in teaching or training is highly preferred. Data Science experience is a bonus!
- Technical Skills: Proficiency in tools such as Excel, SQL, Python, R, Tableau, Power BI, and machine learning libraries (e.g., Scikit-learn, TensorFlow).
- Strong Communication Skills: Ability to explain complex technical concepts clearly and engage learners.
- Teaching and Mentoring Skills: Strong presentation and mentoring abilities to help students understand and apply data analytics concepts.
- Certifications (Preferred): Data analytics or data science certifications (e.g., Google Data Analytics, Microsoft Certified: Data Analyst Associate) are a plus.

Additional Preferred Skills:
- Experience with advanced analytics tools (Hadoop, Spark) or cloud platforms (AWS, Google Cloud).
- Ability to deliver both in-person and remote training.
- Familiarity with business applications of data analytics in industries such as healthcare, finance, and marketing.
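A first data-cleaning lesson of the kind this trainer would deliver, dropping missing values, computing summary statistics, and flagging outliers, can be shown with Python's standard statistics module (the sample sales figures are invented):

```python
import statistics


def summarize(values):
    """Clean a numeric series (drop missing values), flag simple outliers,
    and report the summary statistics a first analytics lesson covers."""
    cleaned = [v for v in values if v is not None]
    mean = statistics.mean(cleaned)
    stdev = statistics.stdev(cleaned)
    # Flag points more than 2 standard deviations from the mean.
    outliers = [v for v in cleaned if abs(v - mean) > 2 * stdev]
    return {
        "n": len(cleaned),
        "mean": round(mean, 2),
        "median": statistics.median(cleaned),
        "outliers": outliers,
    }


sales = [120, 130, None, 125, 128, 900, 122]  # 900 looks like a data-entry error
summary = summarize(sales)
```

The example also motivates why the median (126.5 here) is a better center than the mean when an outlier is present, a classic teaching point before moving on to pandas.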
Posted 1 month ago
7.0 - 10.0 years
20 - 25 Lacs
Chennai
Work from Office
Job Description
A US-based IT consulting and product firm is currently looking for a Senior Developer with expertise in React and .NET for its Chennai office. Responsibilities include leading a development team, conducting code reviews, managing deliverables, and task allocation. Must be proficient in Agile methodologies, process-oriented, and have a strong understanding of mobile development. Ideal candidates should excel in team leadership and efficient project execution.

Desired Candidate Profile
- Experienced in C#/.NET development
- Experience in React, .NET 6/7/8, .NET Framework 4.8, web development, Redux
- Hands-on experience with MS SQL Server and/or other relational databases
- Ability to write and execute unit test cases
- Experience in Azure/AWS is an added advantage; Google Cloud experience would be an additional plus
- Responsive design principles, CI/CD principles, Agile/Scrum methodologies, problem-solving and critical thinking, communication and teamwork abilities
Posted 1 month ago
6.0 - 10.0 years
10 - 15 Lacs
Chennai
Work from Office
Experience: 5-10 years in ETL development, with 3+ years in a leadership role and extensive hands-on experience in Informatica PowerCenter and Cloud Data Integration.

Job Overview: We are seeking a highly skilled and experienced Informatica Lead to join our IT team. The ideal candidate will lead a team of ETL developers and oversee the design, development, and implementation of ETL solutions using Informatica PowerCenter and Cloud Data Integration. This role requires expertise in data integration, leadership skills, and the ability to work in a dynamic environment to deliver robust data solutions for business needs.

Key Responsibilities:
ETL Development and Maintenance:
- Lead the design, development, and maintenance of ETL workflows and mappings using Informatica PowerCenter and Cloud Data Integration.
- Ensure the reliability, scalability, and performance of ETL solutions to meet business requirements.
- Optimize ETL processes for data integration, transformation, and loading into data warehouses and other target systems.
Solution Architecture and Implementation:
- Collaborate with architects and business stakeholders to define ETL solutions and data integration strategies.
- Develop and implement best practices for ETL design and development.
- Ensure seamless integration with on-premises and cloud-based data platforms.
Data Governance and Quality:
- Establish and enforce data quality standards and validation processes.
- Implement data governance and compliance policies to ensure data integrity and security.
- Perform root cause analysis and resolve data issues proactively.
Team Leadership:
- Manage, mentor, and provide technical guidance to a team of ETL developers.
- Delegate tasks effectively and ensure timely delivery of projects and milestones.
- Conduct regular code reviews and performance evaluations for team members.
Automation and Optimization:
- Develop scripts and frameworks to automate repetitive ETL tasks.
- Implement performance tuning for ETL pipelines and database queries.
- Explore opportunities to improve efficiency and streamline workflows.
Collaboration and Stakeholder Engagement:
- Work closely with business analysts, data scientists, and application developers to understand data requirements and deliver solutions.
- Communicate project updates, challenges, and solutions to stakeholders effectively.
- Act as the primary point of contact for Informatica-related projects and initiatives.

Academic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or equivalent.
- Relevant certifications (e.g., Informatica Certified Specialist, Informatica Cloud Specialist) are a plus.

Experience:
- 6-10 years of experience in ETL development and data integration, with at least 3 years in a leadership role.
- Proven experience with Informatica PowerCenter, Informatica Cloud Data Integration, and large-scale ETL implementations.
- Experience in integrating data from various sources such as databases, flat files, and APIs.

Technical Skills:
- Strong expertise in Informatica PowerCenter, Informatica Cloud, and ETL frameworks.
- Proficiency in SQL, PL/SQL, and performance optimization techniques.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud.
- Familiarity with big data tools such as Hive, Spark, or Snowflake is a plus.
- Strong understanding of data modeling concepts and relational database systems.

Soft Skills:
- Excellent leadership and project management skills.
- Strong analytical and problem-solving abilities.
- Effective communication and stakeholder management skills.
- Ability to work under tight deadlines in a fast-paced environment.
Posted 1 month ago
1.0 - 2.0 years
5 - 6 Lacs
Mumbai, Lower Parel
Work from Office
Responsibilities:
- Develop and maintain web applications using the Strapi framework.
- Design and implement APIs using GraphQL or REST.
- Customize and extend the Strapi CMS to meet project requirements.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Optimize application performance and ensure scalability.
- Implement security and data protection measures.
- Create documentation for APIs and data models.
- Troubleshoot and debug applications.
- Stay updated with the latest Strapi releases and best practices.
- Participate in code reviews and provide constructive feedback.

Requirements:
- Proven experience as a Strapi developer or in a similar role.
- Proficient understanding of the Strapi framework and its architecture.
- Experience with GraphQL or RESTful APIs.
- Strong JavaScript (Node.js) programming skills.
- Familiarity with front-end technologies such as React, Vue.js, or Angular.
- Knowledge of database technologies such as MongoDB, PostgreSQL, or MySQL.
- Understanding of server-side templating languages (e.g., Handlebars, Pug).
- Excellent troubleshooting and debugging skills.
- Good understanding of version control systems (e.g., Git).
- Experience with cloud platforms (AWS, Azure, Google Cloud) is a plus.
- Bachelor's degree in Computer Science, Engineering, or a related field (preferred).
Posted 1 month ago
5.0 - 7.0 years
30 - 32 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and implement high-quality, scalable, and maintainable Java applications and services.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Participate in the entire software development lifecycle, including planning, coding, testing, and deployment.
- Write clean, efficient, and well-documented code following best practices and coding standards.
- Troubleshoot and resolve complex issues related to application functionality, performance, and scalability.
- Conduct code reviews and provide constructive feedback to team members to ensure code quality.
- Stay up-to-date with industry trends, technologies, and frameworks, and continuously improve technical skills.
- Mentor and guide junior developers, sharing knowledge and best practices to enhance the team's overall capabilities.
- Collaborate with stakeholders to understand business requirements and propose innovative solutions.

Required Skills:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- At least 4 years of hands-on experience in Java development, with a strong emphasis on backend development using frameworks such as Spring or Hibernate.
- Proficiency in building RESTful web services and APIs.
- Solid understanding of object-oriented programming principles, design patterns, and software architecture.
- Experience with database systems such as MySQL, PostgreSQL, or Oracle.
- Strong knowledge of version control systems (e.g., Git) and build tools (e.g., Maven, Gradle).
- Familiarity with Agile development methodologies and practices.
- Excellent problem-solving and analytical skills, with strong attention to detail.
- Ability to work independently as well as collaboratively in a team-oriented environment.
- Strong communication skills to engage both technical and non-technical stakeholders.
- Proven track record of successfully delivering high-quality software projects on time.
Preferred:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
- Knowledge of front-end technologies like HTML, CSS, and JavaScript frameworks (e.g., React, Angular) is a plus.
- Experience with performance optimization, scalability, and security best practices in Java applications.

Join our dynamic and collaborative team and contribute to the development of innovative software solutions that solve real-world challenges. Apply your expertise and make an impact on our products and customers.

Note: Please provide your portfolio, GitHub profile, or any relevant code samples showcasing your development skills along with your application.
Posted 1 month ago
5.0 - 8.0 years
4 - 7 Lacs
Hyderabad
Work from Office
1. Adobe Experience Platform (AEP) Expertise: Understanding of AEP's architecture, data modeling, and functionalities such as Real-Time Customer Profile, Data Lake, and Customer Journey Analytics.
2. Adobe Experience Cloud (AEC) Products: Hands-on experience with Adobe tools such as Adobe Customer Journey Analytics (CJA), Adobe Experience Manager (AEM), Adobe Target, and Adobe Real-Time CDP.
3. Data Ingestion & Transformation: Experience with ETL (Extract, Transform, Load) processes, data schemas (XDM - Experience Data Model), and integration of multiple data sources into AEP.
4. Query and Data Management: Strong skills in SQL, NoSQL, and Adobe Query Service to process and analyze customer data.
5. Identity Resolution & Identity Graph: Knowledge of identity stitching and how AEP manages customer identities across different data sources.
6. Tag Management & SDKs: Experience with Adobe Launch (Tags) and the Adobe Mobile SDK for data collection and event tracking.
7. Streaming & Batch Data Processing: Ability to work with APIs, event-driven architectures, and batch data ingestion into AEP.
8. Cloud & Big Data Technologies: Experience with AWS, Azure, Google Cloud, Kafka, Spark, or Hadoop is an added advantage.
9. Scripting & Development: Proficiency in JavaScript, Python, or Java for data transformations and API integrations.
10. Adobe Experience Platform APIs: Hands-on experience with AEP APIs to automate and extend platform capabilities.

Additional skills:
- Optimize, automate, and scale AEP implementations
- CI/CD & DevOps
- API development
- Cloud platforms
- Data governance & compliance
- Identity & Access Management (IAM)

Key responsibilities:
- Collecting and integrating data from various sources.
- Centralizing and standardizing customer data for consistency.
- Segmenting customers to target specific groups effectively.
- Designing and optimizing customer journeys to enhance engagement.
- Analyzing data to gain insights and improve marketing strategies.
- Delivering personalized experiences using data science and machine learning.
- Ensuring compliance and security of customer data.
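The "identity stitching" skill mentioned above is, at its core, a graph-connectivity problem: identifiers observed together in the same event are linked, and connected identifiers collapse into one customer profile. A conceptual sketch in Python using union-find (this illustrates the idea only; it is not Adobe's implementation, and the event data is invented):

```python
# Union-find over customer identifiers: any two IDs seen in the same
# event end up in the same set, i.e. the same stitched profile.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Hypothetical events, each listing identifiers observed together.
events = [
    ("email:a@x.com", "cookie:123"),
    ("cookie:123", "crm:42"),
    ("email:b@y.com", "cookie:999"),
]
for ids in events:
    for other in ids[1:]:
        union(ids[0], other)

# All three identifiers for the first customer resolve to one profile.
profiles = {find(i) for i in ("email:a@x.com", "cookie:123", "crm:42")}
print(len(profiles))  # 1
```

Production identity graphs add rules the sketch omits (ID namespaces, authenticated vs. anonymous priority, graph size limits), but the stitching step is the same transitive-linking idea.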
Posted 1 month ago
- 5 years
7 - 12 Lacs
Gurugram
Work from Office
The Role and Responsibilities

We have open positions ranging from Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll work with varied and diverse teams to deliver unique and unprecedented solutions across all industries.

In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner who picks up new technologies quickly whenever required. Most projects involve handling big data, so you will work with related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.

Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders, and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain

Your Attributes, Experience & Qualifications:
- Bachelor's or Master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality

Technical Background:
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R and SAS are desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, and Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, and Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile methodologies and practices such as CI/CD, application resiliency, and security

Valued but not required:
- Compelling side projects or contributions to the open-source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet)
- Familiarity with containerization technologies, such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

We are hiring for engineering roles across levels, from Associate Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years. In addition to the base salary, this position may be eligible for performance-based incentives.
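The pipeline work this listing describes (extract data, clean it, load or aggregate it) can be sketched as three composable functions. A minimal illustrative example in plain Python (the CSV data and field names are invented; a real pipeline at this scale would use Spark or Glue with proper orchestration and a dead-letter path for bad records):

```python
import csv
import io

# Toy raw input: one record has a malformed amount.
RAW = "customer_id,amount\n1,10.5\n2,abc\n3,7.25\n"

def extract(text):
    """Parse raw CSV text into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast fields to proper types, dropping records that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append({"customer_id": int(row["customer_id"]),
                          "amount": float(row["amount"])})
        except ValueError:
            continue  # in production, route to a dead-letter store instead
    return clean

def load(rows):
    """Stand-in for the load step: here, just aggregate the clean records."""
    return sum(r["amount"] for r in rows)

total = load(transform(extract(RAW)))
print(total)  # 17.75
```

Keeping each stage a pure function makes the pipeline unit-testable stage by stage, which is the property the "best practices in data engineering, code hygiene, and code reviews" bullet is asking for.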
Posted 2 months ago
3 - 8 years
5 - 14 Lacs
Pune
Hybrid
Job Title: Data Engineer
Location: Pune, Maharashtra, India
Experience: 4+ years
Job Type: Full-time

What We're Looking For:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: 2-4 years of experience in data engineering, software engineering, or a related field.
- Technical Skills: Proficiency in SQL and Python, and experience with big data tools such as PySpark or Spark.
- Cloud Experience: Familiarity with cloud platforms (e.g., AWS, Azure, or Google Cloud) and with building scalable, distributed systems.
- Data Analysis: Experience in data analysis and modeling, with the ability to work with large datasets.
- Problem-Solving: Strong analytical and problem-solving skills, with a proven ability to address complex data challenges.
- Project Management: Experience managing technical projects and ensuring timely delivery.
- Adaptability: Eagerness to learn new technologies and adapt to shifting project requirements.
- Agile Methodologies: Understanding of Agile delivery frameworks and project management practices.
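The "Data Analysis" skill above typically means group-and-aggregate work over large datasets. A toy sketch of that pattern in plain Python (the sample records are invented; at the scale this role targets, the same logic would be a PySpark `groupBy` followed by an aggregation):

```python
from collections import defaultdict

# Hypothetical sales events; in practice these would come from a
# distributed source such as a data lake or warehouse table.
events = [
    {"city": "Pune", "sales": 120},
    {"city": "Mumbai", "sales": 80},
    {"city": "Pune", "sales": 60},
]

# Group by city and sum sales, the in-memory analogue of a
# Spark groupBy("city").sum("sales").
totals = defaultdict(int)
for e in events:
    totals[e["city"]] += e["sales"]

print(dict(totals))  # {'Pune': 180, 'Mumbai': 80}
```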
Posted 2 months ago