4.0 - 8.0 years
6 - 10 Lacs
Kolkata
Work from Office
Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines on Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate has a good understanding of ML algorithms and experience in model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously improving. You will interact with multiple technical teams, including architects and business stakeholders, to develop state-of-the-art machine learning systems that create value for the business.

Responsibilities:
- Manage the deployment and maintenance of machine learning models in production environments, ensuring seamless integration with existing systems.
- Monitor model performance using metrics such as accuracy, precision, recall, and F1 score, and address issues like performance degradation, drift, or bias.
- Troubleshoot and resolve problems, maintain documentation, and manage model versions for audit and rollback.
- Analyze monitoring data to preemptively identify potential issues and provide regular performance reports to stakeholders.
- Optimize queries and pipelines.
- Modernize applications whenever required.

Qualifications:
- Expertise in programming languages such as Python and SQL.
- Solid understanding of MLOps best practices and concepts for deploying enterprise-level ML systems.
- Understanding of machine learning concepts, models, and algorithms, including traditional regression, clustering models, and neural networks (including deep learning, transformers, etc.).
- Understanding of model evaluation metrics, model monitoring tools, and practices.
- Experience with GCP MLOps tools such as BigQuery ML, Vertex AI Pipelines (Kubeflow Pipelines on GCP), Model Versioning & Registry, Cloud Monitoring, Kubernetes, etc.
- Solid oral and written communication skills and the ability to prepare detailed technical documentation for new and existing applications.
- Strong ownership and collaborative qualities; takes initiative to identify and drive opportunities for improvement and process streamlining.
- Bachelor's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.

Bonus Qualifications:
- Experience with Azure MLOps; familiarity with cloud billing.
- Experience setting up or supporting NLP, Gen AI, or LLM applications with MLOps features.
- Experience working in an Agile environment and an understanding of Lean Agile principles.
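The monitoring metrics this posting names (accuracy, precision, recall, F1 score) all reduce to four confusion-matrix counts. A minimal sketch in pure Python, using made-up labels purely for illustration:

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1 is the harmonic mean of precision and recall
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical true vs. predicted labels for illustration
m = classification_metrics([1, 1, 0, 1, 0, 0, 1, 0],
                           [1, 0, 0, 1, 0, 1, 1, 0])
print(m)  # precision = recall = f1 = accuracy = 0.75 here
```

In production these would typically come from a library such as scikit-learn, but the definitions above are what a drift or degradation alert ultimately compares over time.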
Posted 1 day ago
2.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a skilled Python Developer with hands-on experience in REST API development, cloud-native application design, and SQL. The ideal candidate will be responsible for building scalable, secure, and efficient applications and APIs, and for integrating cloud services into modern development pipelines.

Key Responsibilities:
- Design, develop, and deploy RESTful APIs using Python frameworks.
- Build and maintain cloud-native applications on platforms such as AWS, Azure, or GCP.
- Write efficient, reusable, and reliable Python code following best practices.
- Optimize SQL queries and interact with relational databases for data storage and retrieval.
- Collaborate with DevOps, QA, and product teams to deliver high-quality software solutions.
- Ensure performance, scalability, and security of application components.

Required Skills:
- Proficiency in Python (3+ years of experience preferred).
- Strong experience developing and consuming REST APIs.
- Working knowledge of cloud-native architectures and services (AWS/Azure/GCP).
- Solid understanding of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Familiarity with containerization tools like Docker and orchestration tools like Kubernetes is a plus.

Good to Have:
- Experience with asynchronous programming in Python (e.g., asyncio, FastAPI).
- Familiarity with CI/CD pipelines and version control (Git).
- Knowledge of API authentication mechanisms such as OAuth2 or JWT.
- Exposure to Agile/Scrum methodologies.

Skills: jwt, sql, gcp, kubernetes, aws, cloud-native application design, postgresql, oauth2, azure, api, scrum, asynchronous programming, mysql, ci/cd, databases, git, python, rest api development, agile, docker
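The asynchronous-programming skill this posting mentions (asyncio) is about running I/O-bound work concurrently. A minimal standard-library sketch; `fetch_resource` is a hypothetical stand-in for something like an HTTP call to a REST endpoint:

```python
import asyncio

async def fetch_resource(name: str, delay: float) -> str:
    # Stand-in for an I/O-bound call (e.g., an HTTP request to a REST API)
    await asyncio.sleep(delay)
    return f"{name}: ok"

async def main() -> list:
    # gather() runs both coroutines concurrently, so total wall time
    # is roughly the longest single delay, not the sum of the delays
    return await asyncio.gather(
        fetch_resource("users", 0.02),
        fetch_resource("orders", 0.01),
    )

results = asyncio.run(main())
print(results)  # ['users: ok', 'orders: ok']
```

Frameworks like FastAPI build on exactly this model: each request handler is a coroutine, and awaiting I/O frees the event loop to serve other requests.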
Posted 1 day ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
What you'll be doing:
- Designing and delivering the technical elements of our analytics projects.
- Applying descriptive and predictive analytics techniques, including but not limited to: customer segmentation, market basket analysis, topic and sentiment analysis, predictive modelling / machine learning models, and data visualization.
- Simplifying complex analysis into clear and compelling recommendations that enable the project team to achieve our client's desired outcomes.
- Coordinating integration, structuring, visualization, and data analysis activities.
- Actively contributing to exchanges and the sharing of best practices related to data.
- Developing efficient operations processes for high availability of the Global Data platform.
- Working as part of our global team; our operational, Insight, and IT teams work together to solve data challenges.

What experience you'll need:
- Minimum 4 years' experience working as a data scientist.
- Experience with cloud computing environments such as Azure, GCP, or AWS.
- Experience working with large datasets and building data models.
- Advanced knowledge of machine learning techniques, e.g., forecasting, recommender engines, predictive modelling, clustering.
- Experience with text mining use cases, e.g., topic building, sentiment analysis, summarization.
- Advanced knowledge of Python.
- Experience working with GenAI and large language models.
- Experience building, testing, and deploying data visualization dashboards (Power BI/Tableau).
Posted 1 day ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Job Overview: We are seeking a highly skilled Senior Fullstack Engineer to join our dynamic team. This role demands extensive experience in both frontend and backend development, along with a strong grasp of cloud technologies and database management. You will work closely with the Engineering team, Product team, and other stakeholders to design and implement scalable, secure, and high-performance solutions. As a technical leader, you will ensure adherence to best practices, provide mentorship, and drive cross-functional collaboration.

Responsibilities:
- Technical Ownership: Design and architect complex, scalable full-stack solutions across multiple teams and systems.
- Hands-on Development: Write clean, maintainable, and efficient code, primarily in React (frontend) and Node.js (backend).
- Cross-Functional Collaboration: Work with product, design, QA, and DevOps to drive alignment and deliver business value.
- Code & Design Reviews: Set and enforce coding standards, review code regularly, and guide design discussions.
- Scalability & Performance: Optimize applications for speed, efficiency, and scalability across services and UIs.
- Mentorship: Guide and upskill senior/staff engineers and engineering leads; drive best practices and continuous learning.
- Tech Strategy: Contribute to the long-term technology vision, evaluate new tools/frameworks, and de-risk architectural decisions.
- DevOps and CI/CD: Collaborate on infrastructure automation, deployment pipelines, and observability practices.
- Security & Compliance: Ensure engineering outputs meet high standards of security, data privacy, and compliance (e.g., GLBA, GDPR, CCPA).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of software engineering experience, including 7+ years in full-stack development.
- Deep expertise in React, Node.js, and TypeScript.
- Proven experience architecting complex systems and influencing tech direction at scale.
- Solid understanding of data structures, system design, API design, and microservices.
- Experience with cloud-native apps, containers, and DevOps workflows.
- Strong communication, mentorship, and leadership-by-influence skills.

Tech Stack:
- Frontend: React, Redux/Context API, TypeScript, Tailwind/CSS-in-JS
- Backend: Node.js, Express/Nest.js, TypeScript
- API: REST, GraphQL
- Database: PostgreSQL, MongoDB, Redis
- Infra/DevOps: Docker, Kubernetes, GitHub Actions, AWS/GCP
- Testing: Jest, Cypress, React Testing Library, Supertest

Leadership & Team:
- Proven experience in coaching and mentoring a team of developers.
- Proven track record of delivering complex projects successfully.
- Ability to conduct code reviews and provide constructive feedback.
- Experience in agile methodologies (Scrum, Kanban).
- Ability to manage project timelines and deliverables effectively.
- Excellent verbal and written communication skills; ability to explain technical concepts to non-technical stakeholders.
- Strong analytical and problem-solving skills; ability to troubleshoot and resolve complex technical issues.
- Experience working with cross-functional teams (designers, product managers, QA).
- Ability to quickly learn and adapt to new technologies and frameworks.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves
Posted 1 day ago
8.0 - 12.0 years
35 - 60 Lacs
Bengaluru
Work from Office
Job Summary: The NetApp Platform Product Team is responsible for building, launching, and managing NetApp's core storage and data management products. The team builds the foundational product capabilities behind NetApp's flagship storage platforms and is at the forefront of driving a transformational journey at NetApp by accelerating innovation across NetApp's hybrid cloud portfolio (public, private, and on-premises). The AI data platform is a key focus area for the Data Services group and involves the development and alignment of Product, Technical Marketing, GTM strategies, Customer Value Enablement, and Lifecycle Management practices across the NetApp product portfolio. The goal of the Product Management team is to enable ongoing customer success, increase product adoption, and drive customer lifetime value growth across all market segments. This Technical Marketing Engineer (TME) role will be responsible for the AI data platform for NetApp ONTAP storage and will work closely with key Field Technical teams, Product Management, Engineering, Product Marketing, and other key partners. In this role, you will focus on delivering key technical value propositions to the Field and Customers through the creation of presentations, solution briefs, blogs, and technical white papers, conducting proactive Field training/enablement sessions, and creating Product-Field cohesion by influencing both Product Development efforts and Field initiatives. The successful candidate has a passion for understanding key customer requirements for major customer segments and use cases, identifying key product value drivers, and establishing how NetApp delivers differentiated value for customer technical and business outcomes.

Job Requirements:
- GTM Collateral Creation: Create competitive collateral (Why NetApp Storage presentations, videos, blogs, solution briefs, etc.), pre-sales guides and demo narratives to establish technical wins, and technical white papers and reference architectures; establish demonstration narratives for solutions and products; support partner integrations and solution validations.
- Sales Enablement: Conduct periodic sales/partner training and information delivery sessions; identify what is and is not working from a pre-sales perspective and make course corrections in technical messaging.
- Launches and Events: Support product launch activities and assist with trade shows, conferences, and Technical Sales Training or Channel Partner Training activities.
- Sales Support: Respond to customer or field inquiries, assist with technical RFPs, and work with a cross-functional team of PMs, architects, and engineers as required.
- Product Development Influence: Influence the product roadmap and development priorities with "outside-in" field and customer perspectives; stay current with trends and competitors to identify improvements or recommend product enhancements.

Capabilities:
- SME in AI & ML, with knowledge of artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases and data pipelining tools, model training, inferencing, and RAG workflows.
- Data storage and virtualization; knowledge of hypervisors such as VMware ESX and Linux KVM.
- Unix-based operating system kernels and development environments, e.g., Linux or FreeBSD.
- Experience with the cloud hyperscalers and their services (Amazon Web Services, Microsoft Azure, Google Cloud Platform).
- Prior experience with NetApp ONTAP or other storage is preferred.
- Experience building test environments including compute, storage, and networking components and enterprise workloads, both on-premises and in the cloud.
- Experience creating content, including but not limited to videos, presentations, blogs, and technical documents.
- Communicates in a clear, concise, and professional manner, tailoring the message to the audience, in both verbal and written communications.
- Ability to lead, influence, and work with a global team.

Education:
- A minimum of 8 years of relevant experience working in enterprise software, IT infrastructure, or technical marketing is required; 10 to 12 years is preferred.
- A Bachelor of Science or advanced degree in electrical engineering, computer science, or a related technical discipline, or equivalent experience, is required.
- Demonstrated ability to complete multiple complex technical projects.
- Demonstrated experience authoring technical papers, as well as project conceptualization and planning.
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Engineer
Location: Gurugram (On-site)
Company: Darwix AI
Employment Type: Full-Time

About the Role: Darwix AI is looking for a Data Engineer with strong experience in Python, CI/CD pipeline setup, and system architecture design. You will be responsible for building and optimizing data pipelines, ensuring seamless integration of data across systems, and supporting scalable infrastructure to power our AI-driven solutions.

Key Responsibilities:
- Design, implement, and maintain efficient data pipelines and workflows.
- Set up and manage CI/CD pipelines for automated deployment and testing.
- Develop scalable system architecture for real-time and batch data processing.
- Collaborate with cross-functional teams to deliver production-ready data solutions.
- Monitor data flow, ensure data integrity, and optimize pipeline performance.

Requirements:
- Proficiency in Python with a strong understanding of data structures and algorithms.
- Experience setting up CI/CD pipelines using tools like GitLab CI, Jenkins, or similar.
- Solid understanding of system architecture and distributed data processing.
- Familiarity with cloud platforms (AWS/GCP/Azure) and containerization tools like Docker.
- 3-5 years of experience in a similar data engineering role.
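The batch data pipelines described above follow an extract-transform-load shape that can be sketched in a few lines of pure Python; the record layout and field names here are hypothetical, and in practice the extract and load stages would talk to real sources and sinks:

```python
def extract(rows):
    # Extract: in practice this would read from an API, file, or database
    yield from rows

def transform(records):
    # Transform: normalise field names/values and drop malformed records
    for r in records:
        if r.get("amount") is not None:
            yield {"user": r["user"].strip().lower(),
                   "amount": round(float(r["amount"]), 2)}

def load(records):
    # Load: in practice this would write to a warehouse; here we collect
    return list(records)

raw = [{"user": " Alice ", "amount": "10.504"},
       {"user": "Bob", "amount": None}]  # second record is malformed
result = load(transform(extract(raw)))
print(result)  # [{'user': 'alice', 'amount': 10.5}]
```

Using generators keeps each stage streaming, so memory stays flat regardless of batch size; an orchestrator (Airflow, a CI/CD job, etc.) would schedule and monitor the run.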
Posted 1 day ago
8.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
JOB DESCRIPTION

[Job Title]: SAP BASIS DBA Expert with hands-on exposure to OS (Linux/Windows) and cloud products, including BTP.

[Project Details]: 8-10 years of IT industry experience encompassing a wide range of skills and roles as an SAP Basis-HANA (DBA) consultant. Extensive experience with administration, installation, implementation, migrations, and upgrades of SAP applications. Widely experienced in maintenance business processes, supervision, team leading, training, and skills transfer. Good communication and interpersonal skills; self-motivated, a quick learner, and a team player.

[Technology and Sub-technology]
- SAP Basis: S4HANA, SAP R/3 4.6B, SAP R/3 4.7, SAP NetWeaver 7.0/7.1/7.3/7.4, S/4HANA Cloud 1702/1705, S/4HANA on-premise 1511/1610/1809
- SAP Components: S4HANA, ECC, BW, SRM, APO, CRM, PI, SOLMAN
- Operating Systems: Solaris, Linux, AIX, Windows Server
- Hands-on exposure to SAP and database clusters
- Database: HANA (mandatory); Oracle, DB2 LUW, and liveCache knowledge good to have
- Unix shell scripting
- Good understanding of network flow
- Working exposure to SOLMAN ChaRM
- Practical exposure to AWS/Azure
- Exposure to BTP

[Base Location]: Bangalore
[Type]: In Office
[Qualifications]: B.Tech or above

[Job Overview]:
(1) Liaise with key members of the business team to understand the business impact of critical incidents and analyze root causes. Ensure learning from these incidents is applied for future benefit.
(2) Monitor system performance and batch jobs related to SAP Basis and ensure proactive measures are taken to prevent major incidents.
(3) Work closely with the support team and plan SAP Basis support-related activities. Establish a good working relationship with the BP business.
(4) Hands-on S/4 version upgrade experience.
(5) Ensure continuity in service delivery. Ensure the offshore team has adequate knowledge to prepare technical design documents.
(6) Understand new business requirements, prepare the corresponding Functional Design Document, and deliver it per the change management process.
(7) Focus constantly on areas to improve. Introduce measures to monitor and continuously improve incident response and restoration times.
(8) Use SAP Basis expertise to help improve resolution of any failure scenarios.
(9) Manage technical cutovers and other new tasks such as SAP upgrades and OS/DB migrations.
(10) Provide guidance and support to team members for any SAP Basis activities; share knowledge through regular KT sessions.

[Must Have Skills] (in order of importance):
- SAP BASIS with DBA (HANA, DB2 LUW, Oracle, etc.)
- AWS practical exposure
- BTP knowledge

[Good to Have Skills]: Knowledge of other cloud technologies such as Azure, GCP, etc.

[Responsibilities and Duties]:
- Performing BASIS and DBA technical steps
- S/4 version upgrades, EHP upgrades
- Performing OS-related tasks
- BTP exposure
- Coordination with end users
- Discussion with stakeholders on deliveries
- Mentoring the team on technical areas

[Keywords]: BASIS, HANA, S4 HANA, DBA, BTP, AWS
Posted 1 day ago
8.0 - 12.0 years
11 - 18 Lacs
Faridabad
Remote
We are seeking an experienced and highly skilled Senior Data Scientist to drive data-driven decision-making and innovation. In this role, you will leverage your expertise in advanced analytics, machine learning, and big data technologies to solve complex business challenges. You will be responsible for designing predictive models, building scalable data pipelines, and uncovering actionable insights from structured and unstructured datasets. Collaborating with cross-functional teams, your work will empower strategic decision-making and foster a data-driven culture across the organization.

Role & responsibilities:
1. Data Exploration and Analysis: Collect, clean, and preprocess large and complex datasets from diverse sources, including SQL databases, cloud platforms, and APIs. Perform exploratory data analysis (EDA) to identify trends, patterns, and relationships in data. Develop meaningful KPIs and metrics tailored to business objectives.
2. Advanced Modeling and Machine Learning: Design, implement, and optimize predictive and prescriptive models using statistical techniques and machine learning algorithms. Evaluate model performance and ensure scalability and reliability in production. Work with both structured and unstructured data for tasks such as text analysis, image processing, and recommendation systems.
3. Data Engineering and Automation: Build and optimize scalable ETL pipelines for data processing and feature engineering. Collaborate with data engineers to ensure seamless integration of data science solutions into production environments. Leverage cloud platforms (e.g., AWS, Azure, GCP) for scalable computation and storage.
4. Data Visualization and Storytelling: Communicate complex analytical findings effectively through intuitive visualizations and presentations. Create dashboards and visualizations using tools such as Power BI, Tableau, or Python libraries (e.g., Matplotlib, Seaborn, Plotly). Translate data insights into actionable recommendations for stakeholders.
5. Cross-functional Collaboration and Innovation: Partner with business units, product teams, and data engineers to define project objectives and deliver impactful solutions. Stay updated on emerging technologies and best practices in data science, machine learning, and AI. Contribute to fostering a data-centric culture within the organization by mentoring junior team members and promoting innovative approaches.

Preferred candidate profile:
- Proficiency in Python, R, or other data science programming languages.
- Strong knowledge of machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Advanced SQL skills for querying and managing relational databases.
- Experience with big data technologies (e.g., Spark, Hadoop) and cloud platforms (AWS, Azure, GCP), preferably MS Azure.
- Familiarity with data visualization tools such as Power BI, Tableau, or equivalent, preferably MS Power BI.

Analytical and Problem-solving Skills:
- Expertise in statistical modeling, hypothesis testing, and experiment design.
- Strong problem-solving skills to address business challenges through data-driven solutions.
- Ability to conceptualize and implement metrics/KPIs tailored to business needs.

Soft Skills:
- Excellent communication skills to translate complex technical concepts into business insights.
- Collaborative mindset with the ability to work in cross-functional teams.
- Proactive and detail-oriented approach to project management and execution.

Education and Experience:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- 8+ years of experience in data science, advanced analytics, or a similar field.
- Proven track record of deploying machine learning models in production environments.

Perks & Benefits:
- Competitive compensation as per market standards.
- Work-from-home opportunity.
- 5-day work week.
- Shift timing 2 PM-11 PM IST (flexible hours).
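The SQL-driven KPI work described in this posting can be sketched with the standard-library `sqlite3` module; the `orders` table, its columns, and the revenue figures are all hypothetical, standing in for a warehouse query:

```python
import sqlite3

# In-memory database standing in for a real warehouse connection
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("north", 120.0), ("north", 80.0), ("south", 50.0)])

# KPI: total and average revenue per region, highest total first
rows = conn.execute("""
    SELECT region, SUM(revenue) AS total, AVG(revenue) AS avg_rev
    FROM orders
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('north', 200.0, 100.0), ('south', 50.0, 50.0)]
conn.close()
```

The same GROUP BY / aggregate pattern is what dashboard tools like Power BI or Tableau issue under the hood when a KPI card is bound to a relational source.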
Posted 1 day ago
12.0 - 17.0 years
14 - 18 Lacs
Pune
Work from Office
You'll break new ground by:
- E2E Responsibility: Take complete responsibility for development, ensuring alignment and the successful delivery of solutions.
- Hands-on Approach: Use deep technical knowledge to inspire, guide, and support the development team, and collaborate closely with solution architects.
- Development Standards: Lead the development of secure, robust, scalable, and maintainable software, ensuring compliance with quality processes and product standards.
- Budget Management: Create, align, and track budgets for all development activities.
- Strategic Development: Continuously evolve the development factory concept, including scaling up in volume, technologies, and maturity. Define and operationalize a KPI-based steering framework and improve the maturity level of development across dimensions such as performance, quality, and cost efficiency.
- Workload and Resource Planning: Plan, forecast, and track development workloads and resources.
- Activity Coordination: Collaborate with Release Management to plan, forecast, and track development activities.
- Quality Assurance: Serve as the main contact for Quality Assurance on DEV inputs and results, requiring deep SAP understanding and experience.
- Partner Management: Manage the resources and performance of internal services and external partners.

You're excited to build on your existing expertise, including:
- 12+ years of proven experience in project management of SAP S/4HANA implementation projects, or a similar role.
- B.E./B.Tech/MCA or equivalent qualification from a reputed institute.
- Experience handling multiple projects simultaneously using scrum of scrums.
- Experience in stakeholder management.
- Certifications in project management (e.g., PMP).
- Strong understanding of the software development lifecycle (SDLC) and Agile methodologies.
- Proficiency with release management tools and CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills.
- Strong organizational and project management skills.
- Ability to work effectively in a fast-paced and dynamic environment.
- Excellent communication and collaboration skills.
- Experience working with offshore/global teams; willingness to learn and adapt to new technologies.
- Exposure to and experience in several of the following technological areas:
  - S/4HANA development; cloud development, integration, and deployment on AWS, GCP, or Azure.
  - Programming languages such as ABAP, Python, Java, or JavaScript; proficiency in Advanced Business Application Programming (ABAP) is crucial for developing and customizing SAP applications.
  - Compliance with development standards and methodologies (DevOps, Scrum, quality, security, performance).
  - SAP Solution Manager as an application lifecycle management tool for SAP.
  - Deep knowledge of user interfaces such as SAP Fiori and SAPUI5.
  - SAP BTP for developing and deploying applications in the cloud.
  - Modeling, documenting, and simulating business processes using SAP Signavio.
- Project Management: Experience with project management methodologies and tools like Azure Boards, MS Project, etc.
- Data Analysis: Strong skills in data analysis and reporting.
- Technology Integration: Experience integrating various technologies and platforms.
- Experience with version control systems like Git or SVN to manage code changes and releases.
Posted 1 day ago
3.0 years
0 Lacs
Sahibzada Ajit Singh Nagar, Punjab, India
On-site
Job Summary: We are looking for an experienced Senior Node.js Developer to join our development team. The ideal candidate will have a passion for backend development, expertise in building scalable applications, and a deep understanding of server-side logic. You'll work with a team of talented engineers to develop, improve, and scale our backend systems.

Key Responsibilities:
- Backend Development: Design and implement APIs, web services, and backend systems using Node.js.
- Database Management: Develop and optimize database solutions, ensuring data consistency, integrity, and security.
- Collaboration: Work closely with frontend developers, designers, and other team members to create a cohesive product.
- Testing & Debugging: Write unit tests, perform debugging, and ensure the codebase meets quality standards.
- Scalability: Ensure the application is scalable, maintainable, and performs well under high traffic.
- Code Review: Participate in code reviews, share feedback, and promote best practices for coding.
- Documentation: Maintain clear, comprehensive documentation for the system architecture, APIs, and codebase.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: 3+ years of experience in backend development with Node.js.
- Proficiency in JavaScript and ES6+ features.
- Experience with RESTful and/or GraphQL API design.
- Knowledge of frameworks like Express.js or Koa.
- Familiarity with database technologies (e.g., MongoDB, MySQL, PostgreSQL).
- Understanding of containerization (Docker) and cloud services (AWS, GCP).

Skills: MongoDB, Express, ES6, socket programming
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Size: Mid-Sized
Experience Required: 5-10 years
Working Days: 5 days/week
Office Location: Karnataka, Bengaluru
Role & Responsibilities
We are seeking an experienced and passionate Software Engineering Manager to lead our team of talented software engineers. In this role, you will be responsible for driving the development and delivery of high-quality software products, fostering a culture of collaboration and innovation, and ensuring alignment with our business goals. You will lead the backend engineering team responsible for the customer-facing side of our cloud-based, mid-size eCommerce platform. This team plays a critical role in delivering a fast, reliable, and secure experience to thousands of customers daily. You'll oversee the design and development of scalable APIs, data services, and integrations that power our user experience, checkout processes, and product discovery features. As the team's leader, you'll guide architecture decisions, mentor engineers, and collaborate closely with frontend, DevOps, and product teams to ensure seamless, high-performance delivery.
- Lead and mentor a team of software engineers, fostering professional growth and development.
- Architect, design, develop, troubleshoot, and implement rich cloud-native software applications.
- Manage project timelines, deliverables, and resource allocation to meet business objectives.
- Collaborate cross-functionally with product managers, designers, and other stakeholders to define technical requirements and project scope.
- Ensure high standards of software quality through code reviews, automated testing, and best practices.
- Drive continuous improvement in engineering practices, tools, and processes.
- Identify and mitigate technical risks and issues throughout the development lifecycle.
- Recruit and retain top engineering talent by creating an inclusive and engaging team environment.
- Contribute to planning and decision-making at the engineering and company level.
Ideal Candidate
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience (5+ years) in software development, with at least 2 years in a management role.
- Experience working in an eCommerce environment.
- Strong technical background with proficiency in one or more programming languages such as Java, Python, or Go.
- Demonstrated experience in agile software development methodologies.
- Excellent communication, interpersonal, and organizational skills.
- Ability to manage multiple priorities and navigate a fast-paced environment.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and modern DevOps practices is a plus.
Perks, Benefits and Work Culture
- Work with cutting-edge technologies on high-impact systems.
- Be part of a collaborative and technically driven team.
- Enjoy flexible work options and a culture that values learning.
- Competitive salary, benefits, and growth opportunities.
Skills: project management, Python, agile software development, Java, DevOps, cloud platforms (AWS, Azure, GCP), eCommerce, software development, code reviews, Go, software development management, collaboration, mentoring, architecture, automated testing
Posted 1 day ago
8.0 - 13.0 years
25 - 30 Lacs
Mumbai
Work from Office
Job Summary
This position provides input and support for, and performs, full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations, collaborates with teams, and supports emerging technologies to ensure effective communication and achievement of objectives. It also provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.
Technical Skills:
- Strong proficiency in .NET, .NET Core, C#, and REST APIs.
- Strong expertise in PostgreSQL.
- Additional preferred skills: Docker, Kubernetes.
- Cloud: GCP, including Google Cloud Storage and Pub/Sub.
- Monitoring tools: Dynatrace, Grafana; API security and tooling (SonarQube).
- Agile methodology.
Key Responsibilities:
- Design, develop, and maintain scalable C# applications and microservice implementations.
- Implement RESTful APIs for efficient communication between client and server applications.
- Collaborate with product owners to understand requirements and create technical specifications.
- Build robust database solutions and ensure efficient data retrieval.
- Write clean, maintainable, and efficient code.
- Conduct unit testing, integration testing, and code reviews to maintain code quality.
- Implement industry-standard protocols related to API security, including OAuth.
- Implement scalable, high-performance solutions that integrate with Pub/Sub messaging and other GCP services (BigQuery, Dataflow, Spanner, etc.).
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Integrate and manage data flows between different systems using Kafka, Pub/Sub, and other middleware technologies.
Qualifications:
- Bachelor's Degree or international equivalent.
- 8+ years of IT experience in .NET/C#.
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field preferred.
Posted 1 day ago
2.0 years
0 Lacs
India
On-site
This job is with Parexel, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.
Job Purpose
- Provide medical review, analysis, and guidance during the case handling and reporting cycle of Adverse Event and Adverse Reaction reports received for investigational and marketed products.
- Provide medical guidance and input to Drug Safety Associates (DSAs) and specialists on medical aspects of drug safety.
- Function as pharmacovigilance representative/safety scientist.
General
- Maintaining a good working knowledge of the adverse event profile of assigned products, labeling documents, data handling conventions, the client's guidelines and procedures, and international drug safety regulations.
- Maintaining an awareness of global regulatory requirements and reporting obligations, and organizing workload to ensure compliance with internal and regulatory timelines for adverse event reporting.
- Maintaining excellent knowledge of the safety profile of assigned products.
- Communicating and discussing issues related to the review process with the Project Manager.
- Interacting with internal and external contacts to resolve issues.
- Maintaining a good working knowledge of relevant regulatory guidelines.
- Attending and presenting at client/cross-functional meetings along with other stakeholders.
- Training and mentoring new team members, as required.
- Working as a Subject Matter Expert (SME).
- Assisting the Manager with inspection-readiness activities and audits.
- Providing input for process improvements.
- Working closely with the Manager on process coordination and ensuring all KPIs for the process are met.
Case Report Medical Review (as applicable)
- Performing medical review of cases according to client Standard Operating Procedures (SOPs) and liaising with the client, as required.
- Writing the Pharmacovigilance/Marketing Authorization Holder (MAH) comment and assessing company causality.
- Assessing seriousness and expectedness of reported events.
- Providing medical advice to DSPs and the case processing team.
Skills
- Excellent interpersonal, verbal, and written communication skills.
- Computer proficiency, an ability to work with web-based applications, and familiarity with the Windows operating system.
- Client-focused approach to work.
- A flexible attitude with respect to work assignments and new learning.
- Ability to manage multiple and varied tasks with enthusiasm and prioritize workload with attention to detail.
- Ability to assess the clinical relevance of medical data and to interpret its clinical meaning is essential.
- Willingness to work in a matrix environment and to value the importance of teamwork.
- Strong knowledge of international drug regulation, including GCP and GVP.
Knowledge and Experience
A minimum of 2 years of relevant experience in pharmacovigilance/drug safety is desirable.
Education
MBBS/postgraduate degree in Medicine.
Posted 1 day ago
2.0 years
0 Lacs
India
On-site
We are looking for a motivated and curious Generative AI Engineer (Junior to Mid-Level) to join our growing Generative AI team. In this role, you'll work closely with senior engineers to help develop and deploy cutting-edge generative AI systems across real business use cases. You'll contribute to solutions that integrate large language models (LLMs), embeddings, vector databases, and GenAI development frameworks. This is a hands-on role that offers the opportunity to work with production-grade GenAI technologies, grow your skill set, and have a real impact on user-facing AI features.
Role and Responsibilities
Typical duties and responsibilities for the GenAI Engineer position include, but are not limited to:
- Support the fine-tuning, deployment, and integration of large language models (LLMs) and other generative models (e.g., GPT, Mistral).
- Build and maintain retrieval-augmented generation (RAG) pipelines using embeddings and vector databases like FAISS, Pinecone, or Chroma.
- Contribute to prompt engineering, schema design, and function-calling logic to guide LLM behavior in production workflows.
- Work with GenAI development frameworks to build, orchestrate, route, and serve GenAI-powered features, workflows, and agents.
- Stay up to date with the latest advancements in generative AI, prompt tooling, and language model APIs.
- Participate in experiments exploring new architectures, model capabilities, and prompting techniques.
- Help evaluate model behaviors and document findings to support continual improvement.
- Work under the guidance of senior AI engineers and participate in code reviews, design discussions, and technical planning.
- Take ownership of well-scoped components and contribute to production releases.
- Embrace a learning-first environment and help foster a collaborative, transparent team culture.
- Collaborate with cross-functional teams (engineering, product, design) to deliver user-facing GenAI features.
- Implement backend APIs and services powered by LLMs, including local and cloud-hosted models.
- Support LLM observability efforts to track pipeline execution, logs, LLM calls, performance, latency, and failure cases.
- Assist in evaluating and mitigating bias, hallucination, and safety risks in generative AI systems.
- Ensure that outputs and user-facing AI behavior follow responsible AI principles and privacy standards.
- Share knowledge and help build team culture; since we work in a team structure, a spirit of knowledge sharing is essential.
- Keep aligned with our teams' coding and design standards.
- Communicate and work well with others; deliver on time while keeping a high quality of work.
- Quickly become an expert in our tech stack, specifically the systems relating to your role, requesting training when required.
- Check in with your line manager and update them on your progress.
Qualifications:
- Bachelor's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field (or equivalent hands-on experience).
- 2+ years of experience in software development or applied machine learning (including internships, research, or open-source contributions).
- Proven hands-on experience working with LLM APIs.
- Practical exposure to embeddings, vector databases, and RAG workflows.
- Experience using core frameworks such as LangChain, LangGraph, LiteLLM, or Ollama.
- Proficiency in Python and familiarity with AI/ML libraries such as Hugging Face Transformers or PyTorch.
- Understanding of prompt engineering, output parsing, and function/tool calling with LLMs.
- Familiarity with vector databases (e.g., FAISS, Pinecone, Weaviate, Chroma) and semantic retrieval techniques.
- Basic understanding of structured outputs using Pydantic or JSON Schema.
- Experience with LLM observability tools like LangSmith, Langfuse, Opik, etc.
- Comfortable working with Git, APIs, and cloud platforms (AWS, GCP, etc.).
- Experience building or contributing to agentic workflows, multi-step LLM tools, or stateful AI agents is a plus.
- Awareness of ethical AI challenges, bias mitigation, and model evaluation techniques is a plus.
- Contributions to open-source AI projects or technical writing in the GenAI space are a plus.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills across technical and non-technical teams.
- Excellent verbal and written English communication skills.
- High attention to detail, curiosity, and willingness to experiment and learn.
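The retrieval half of a RAG pipeline like the one this posting describes reduces to embedding lookup plus similarity ranking. A minimal sketch with hand-made toy vectors (the document ids and 3-dimensional vectors here are hypothetical; a real system would use a model's embeddings and a vector store such as FAISS or Chroma):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    """Return the top_k document ids ranked by cosine similarity."""
    scored = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

# Toy 3-dimensional "embeddings" standing in for real model output.
index = {
    "refund-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.9, 0.1],
    "api-auth-guide": [0.0, 0.2, 0.9],
}

query = [0.85, 0.15, 0.05]  # would be the embedding of the user's question
context_ids = retrieve(query, index)
print(context_ids)  # ['refund-policy', 'shipping-times']
```

The retrieved documents would then be stuffed into the LLM prompt as context; the ranking logic itself is this simple, which is why the engineering effort goes into embedding quality, chunking, and index scale rather than the similarity math.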
Posted 1 day ago
3.0 - 6.0 years
0 - 0 Lacs
Chennai
Work from Office
Job Description: We are seeking a highly skilled Data Engineer with a minimum of 3 years of experience in analytical thinking and working with complex data sets. The ideal candidate will have a strong background in SQL, BigQuery, and Google Cloud Platform (GCP), with hands-on experience in developing reports and dashboards using Looker Studio, Looker Standard, and LookML. Excellent communication skills and the ability to work collaboratively with cross-functional teams are essential for success in this role. Key Responsibilities: Design, develop, and maintain dashboards and reports using Looker Studio and Looker Standard. Develop and maintain LookML models, explores, and views to support business reporting requirements. Optimize and write advanced SQL queries for data extraction, transformation, and analysis. Work with BigQuery as the primary data warehouse for managing and analyzing large datasets. Collaborate with business stakeholders to understand data requirements and translate them into scalable reporting solutions. Implement data governance, access controls, and performance optimizations within the Looker environment. Perform root-cause analysis and troubleshooting for reporting and data issues. Maintain documentation for Looker projects, data models, and data dictionaries. Stay updated with the latest Looker and GCP features and best practices. Qualifications: Minimum of 3 years of experience in data analysis, with a focus on analytical thinking and working with complex data sets. Proven experience in creating data stories and understanding metric and channel trends. Strong proficiency in SQL, BigQuery, and Google Cloud Platform (GCP). Hands-on experience with Looker Studio, Looker Standard, and LookML. Excellent communication skills and the ability to work collaboratively with cross-functional teams. Strong problem-solving skills and attention to detail. Ability to manage multiple tasks and projects simultaneously.
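Postings like this one lean heavily on advanced SQL against BigQuery. A hedged sketch of the kind of aggregation query involved, using in-memory SQLite as a stand-in (the table and column names are invented; BigQuery's dialect adds partitioning and clustering features SQLite lacks, but the filter-early, aggregate-once pattern carries over):

```python
import sqlite3

# In-memory SQLite stands in for BigQuery here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, channel TEXT, revenue REAL, event_date TEXT);
    INSERT INTO events VALUES
        ('u1', 'paid',    10.0, '2024-01-01'),
        ('u2', 'organic',  0.0, '2024-01-01'),
        ('u1', 'paid',     5.0, '2024-01-02'),
        ('u3', 'organic', 12.0, '2024-01-02');
""")

# Channel-level metrics in one pass: distinct users and total revenue,
# with the date filter applied before aggregation.
query = """
    SELECT channel,
           COUNT(DISTINCT user_id) AS users,
           SUM(revenue)            AS revenue
    FROM events
    WHERE event_date >= '2024-01-01'
    GROUP BY channel
    ORDER BY revenue DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('paid', 1, 15.0), ('organic', 2, 12.0)]
```

In Looker, a query of this shape would typically live behind a LookML explore, with `users` and `revenue` defined as measures so dashboards reuse one governed definition.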
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description As a burgeoning tech hub, Hyderabad is the canvas where tradition meets innovation. At Piazza Consulting Group, we champion this fusion, driving the future of business consultancy. We're inviting you – the game-changers and tech enthusiasts – to be a part of our dynamic Innovation Hub in Hyderabad. If you're fueled by the challenges of intricate business problems and are keenly interested in Machine Learning technologies, we're looking for you. Role Description This is a full-time hybrid role for an AI Engineer at Piazza Consulting Group in Hyderabad. The AI Engineer will design, develop, and deploy advanced AI solutions, focusing on Large Language Models and natural language processing. Responsibilities include building and fine-tuning LLMs, integrating AI-driven solutions with business applications, and collaborating with cross-functional teams to deliver scalable software. The role also involves staying ahead of AI advancements, optimizing model performance, and ensuring robust testing and debugging. 
Qualifications
- Expertise in Large Language Models (LLMs) and transformer architectures
- Proficiency in Natural Language Processing (NLP) techniques
- Strong foundation in Computer Science and software development
- Experience with programming languages like Python, Java, or C++
- Familiarity with AI frameworks such as TensorFlow, PyTorch, or Hugging Face Transformers
- Knowledge of model fine-tuning, prompt engineering, and embeddings
- Exceptional problem-solving and analytical skills
- Strong communication and teamwork capabilities
- Master's or PhD in Computer Science, AI, Data Science, or a related field
Preferred Qualifications
- Experience deploying LLMs and AI models in production environments
- 5-10 years of experience in software engineering or AI development
- Expertise with cloud platforms (AWS, Azure, GCP) for AI workloads
- Familiarity with MLOps practices for model lifecycle management
- Proven ability to collaborate in multidisciplinary teams with excellent communication skills
Industry: Product, Business Consulting and AI-Driven Services
Posted 1 day ago
10.0 - 16.0 years
19 - 27 Lacs
Hyderabad, Gurugram, Mumbai (All Areas)
Work from Office
UI/UX Architect (10+ Years) | PAN India | N.P.: 30 days | GCP, Node.js, React.js
UI/UX Architect with 10+ years of experience in designing and leading frontend architecture across web and mobile platforms (mobile is nice to have). Will define the UI/UX vision; architect scalable and secure solutions using React.js, React Native, and Node.js; and ensure seamless integration with GCP Cloud Monitoring and observability tools. Must be able to authenticate with GCP services using SSO.
Key Skills:
- Frontend architecture (React.js), design systems, Node.js
- UI/UX strategy and leadership
- Integration with GCP services (Cloud Monitoring, API Gateway, IAM)
- Design governance, performance optimization, accessibility (WCAG)
- Stakeholder collaboration, mentoring, code quality enforcement
Nice-to-have skills: React Native, mobile app development.
Posted 1 day ago
2.0 - 5.0 years
4 - 7 Lacs
Navi Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working within an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including their cloud storage systems.
Preferred technical and professional experience
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing systems.
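The ETL skills this posting asks for — extract, clean, aggregate — can be sketched without a cluster. A minimal, hedged example in plain Python (the field names are hypothetical); in a PySpark job the same two steps would be DataFrame transformations, with bad rows routed to a dead-letter sink instead of silently dropped:

```python
from collections import defaultdict

raw_rows = [
    {"order_id": "1", "region": "south", "amount": "120.50"},
    {"order_id": "2", "region": "south", "amount": "80.00"},
    {"order_id": "3", "region": "north", "amount": "bad-value"},  # dirty record
    {"order_id": "4", "region": "north", "amount": "45.25"},
]

def clean(rows):
    """Transform step: drop rows whose amount fails to parse as a number."""
    out = []
    for row in rows:
        try:
            out.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route these to a dead-letter sink
    return out

def total_by_region(rows):
    """Aggregate step: sum amounts per region (like GROUP BY in SQL/Spark)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

print(total_by_region(clean(raw_rows)))
# {'south': 200.5, 'north': 45.25}
```

Keeping each step a pure function over rows is what makes the pipeline testable and easy to port between Pandas, PySpark, and SQL.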
Posted 1 day ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad
Work from Office
The Developer leads cloud application development and deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Java 8 and above; Spring Boot, REST APIs, and containerization with Docker.
- Development experience in Java.
- Good communication skills.
- Any cloud exposure: AWS/GCP/Azure.
Preferred technical and professional experience
- Creative problem-solving skills.
- Good communication skills.
Posted 1 day ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you'll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations.
- Analyzes and designs software modules, features, or components of software programs and develops related specifications.
- Develops, tests, documents, and maintains complex software programs for assigned systems, applications, and/or products.
- Gathers and evaluates software project requirements and apprises the appropriate individual(s).
- Codes, tests, and debugs new software or enhances existing software.
- Troubleshoots and resolves, or recommends solutions to, complex software problems.
- Provides senior-level support and mentoring by evaluating product enhancements for feasibility studies and providing completion time estimates.
- Assists management with the planning, scheduling, and assigning of projects to software development personnel.
- Ensures product quality by participating in design reviews, code reviews, and other mechanisms.
- Participates in developing test procedures for system quality and performance.
- Writes and maintains technical documentation for assigned software projects.
- Provides initial input on new or modified product/application system features or enhancements for user documentation.
- Reviews user documentation for technical accuracy and completeness.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Strong knowledge of software development tools and methods; related software languages; test design and configuration; related systems, applications, products, and services.
- Good written and verbal communication skills.
- Ability to test and analyze data and provide recommendations; to organize tasks and determine priorities; and to provide guidance to less experienced personnel.
- Enterprise applications using Java, J2EE, and related technologies.
- Scripting languages like Python and Perl.
- Experience with backend technologies (Spring, Hibernate, Kafka, SQL, REST APIs, microservices, JSP, etc.).
- Hands-on experience with databases like Oracle or similar.
- Familiarity with cloud computing services such as AWS, Azure, and GCP.
- Experience in design and development of UI; knowledge of JavaScript frameworks (React, Angular, Node, etc.).
Preferred technical and professional experience
- Passion for mobile device technologies.
- Proven debugging and troubleshooting skills (memory, performance, battery usage, network usage optimization, etc.).
Posted 1 day ago
2.0 - 5.0 years
4 - 7 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working within an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including their cloud storage systems.
Preferred technical and professional experience
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing systems.
Posted 1 day ago
10.0 - 16.0 years
19 - 27 Lacs
Pune, Chennai, Bengaluru
Work from Office
UI/UX Architect (10+ Years) | PAN India | N.P.: 30 days | GCP, Node.js, React.js
UI/UX Architect with 10+ years of experience in designing and leading frontend architecture across web and mobile platforms (mobile is nice to have). Will define the UI/UX vision; architect scalable and secure solutions using React.js, React Native, and Node.js; and ensure seamless integration with GCP Cloud Monitoring and observability tools. Must be able to authenticate with GCP services using SSO.
Key Skills:
- Frontend architecture (React.js), design systems, Node.js
- UI/UX strategy and leadership
- Integration with GCP services (Cloud Monitoring, API Gateway, IAM)
- Design governance, performance optimization, accessibility (WCAG)
- Stakeholder collaboration, mentoring, code quality enforcement
Nice-to-have skills: React Native, mobile app development.
Posted 1 day ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Website: http://www.intraedge.com
Cloud Data Engineer (GCP)
Experience: 4+ years
Notice period: max 15 days
Location: Gurugram (Hybrid)
We are looking for a highly skilled engineer with solid experience building big data, GCP cloud-based marketing ODL applications. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
Technical Skills
1. Core data engineering skills
- Proficiency with GCP's big data tools: BigQuery for data warehousing and SQL analytics; Dataproc for running Spark and Hadoop clusters.
- Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Functions.
2. Programming and scripting
- Strong coding skills in SQL and Java.
- Familiarity with APIs and SDKs for GCP services to build custom data solutions.
3. Cloud infrastructure
- Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions.
- Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional, but good to have).
4. DevOps and CI/CD
- Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools.
- Monitoring and logging tools like Cloud Monitoring and Cloud Logging for production workflows.
Soft Skills
1. Innovation and problem-solving
- Ability to think creatively and design innovative solutions for complex data challenges.
- Experience in prototyping and experimenting with cutting-edge GCP tools or third-party integrations.
- Strong analytical mindset to transform raw data into actionable insights.
2. Collaboration
- Teamwork: ability to collaborate effectively with data analysts and business stakeholders.
- Communication: strong verbal and written communication skills to explain technical concepts to non-technical audiences.
3. Adaptability and continuous learning
- Open to exploring new GCP features and rapidly adapting to changes in cloud technology.
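"Reliable pipelines" in practice usually means making each step idempotent and retry-safe against transient cloud API errors. A minimal, hedged sketch of the retry pattern (the official GCP client libraries ship configurable retry policies, so hand-rolled loops like this are mainly for custom pipeline steps; `flaky_load` is a simulated stand-in):

```python
import time

def with_retries(fn, attempts=3, backoff_s=0.01):
    """Run fn, retrying on failure with exponential backoff.

    Re-raises the last error once attempts are exhausted, so failures
    still surface to the pipeline's monitoring/alerting layer.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(backoff_s * (2 ** i))

calls = {"n": 0}

def flaky_load():
    """Simulated load step that fails twice with a transient error."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient 503")
    return "loaded"

print(with_retries(flaky_load))  # 'loaded', on the third attempt
```

Pairing retries with idempotent writes (e.g., MERGE into BigQuery rather than blind INSERT) is what keeps a retried step from duplicating data.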
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Overview: As a Senior Technical Consultant at Hashmato, you will play a critical role in designing, implementing, and supporting technology solutions for clients. You will work closely with customers, internal teams, and partners to deliver high-quality technical services, ensuring optimal performance and alignment with business goals.
Key Responsibilities:
✅ Understand client requirements and design appropriate technical solutions
✅ Implement, configure, and integrate Hashmato products or relevant technologies
✅ Provide technical support and troubleshooting for existing implementations
✅ Collaborate with development, QA, and product teams to deliver solutions
✅ Create technical documentation, user guides, and reports
✅ Assist in pre-sales technical activities, including solution demos and proposals
✅ Conduct workshops, training, and knowledge-transfer sessions for clients
Skills & Qualifications:
- 1–3 years of experience in a technical consulting, software engineering, or solution implementation role
- Strong programming/scripting skills (e.g., Python, Java, Golang, or relevant to Hashmato's stack)
- Familiarity with APIs, cloud platforms (AWS, Azure, GCP), and integration technologies
- Good understanding of system architecture, databases (SQL/NoSQL), and security best practices
- Strong problem-solving skills and ability to work independently and as part of a team
- Excellent communication and client-facing abilities
Preferred:
- Prior experience with Hashmato products, SaaS platforms, or similar domains
- Experience in the POS (Point of Sale) industry is a strong advantage
Posted 1 day ago
The job market for Google Cloud Platform (GCP) professionals in India is rapidly growing as more and more companies are moving towards cloud-based solutions. GCP offers a wide range of services and tools that help businesses in managing their infrastructure, data, and applications in the cloud. This has created a high demand for skilled professionals who can work with GCP effectively.
The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.
Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.
In addition to GCP, professionals in this field are often expected to have skills in:
- Cloud computing concepts
- Programming languages such as Python, Java, or Go
- DevOps tools and practices
- Networking and security concepts
- Data analytics and machine learning
As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.