7 - 12 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL), Google BigQuery
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Conduct code reviews and ensure code quality standards are met.
- Mentor junior team members to enhance their skills.

Professional & Technical Skills:
- Must-have skills: proficiency in Apache Spark.
- Good-to-have skills: experience with Oracle PL/SQL and Google BigQuery.
- Strong understanding of distributed computing and parallel processing.
- Experience developing scalable, high-performance applications using Apache Spark.
- Knowledge of data processing frameworks and tools in the big data ecosystem.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Apache Spark.
- This position is based at our Chennai office.
- A 15 years full-time education is required.
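The listing above asks for an understanding of distributed computing and parallel processing with Apache Spark. As a dependency-free illustration (not the employer's code, and plain-Python threads rather than a Spark cluster), here is a sketch of the partition/map/reduce shape that a Spark word-count generalizes across executors:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

# Hypothetical sketch: Spark's RDD model distributes this same
# partition -> map -> merge shape across a cluster of executors.

def count_words(partition):
    """Map step: count words within one partition of lines."""
    return Counter(word for line in partition for word in line.split())

def word_count(lines, num_partitions=4):
    # Split the input into partitions, as Spark would across executors.
    partitions = [lines[i::num_partitions] for i in range(num_partitions)]
    with ThreadPoolExecutor() as pool:
        partial = pool.map(count_words, partitions)
    # Reduce step: merge the per-partition counts into one result.
    return reduce(lambda a, b: a + b, partial, Counter())

counts = word_count(["spark spark flies", "spark lands"])
```

In Spark itself the equivalent would be `rdd.flatMap(str.split).countByValue()`, with partitioning and shuffling handled by the framework.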
Posted 1 month ago
12 - 17 years
14 - 19 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: PySpark, Google BigQuery
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, ensuring they are developed and implemented according to the specified requirements and standards. Your typical day will involve collaborating with the team to understand business needs, designing and developing applications using Apache Spark, configuring them to deliver the required functionality, and testing and debugging them to ensure quality and performance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Design and build applications based on business process and application requirements.
- Configure applications to meet the required functionality.
- Collaborate with the team to understand business needs and translate them into technical requirements.
- Test and debug applications to ensure their quality and performance.
- Provide technical guidance and support to the team.
- Stay updated with the latest technologies and trends in application development.
- Identify and resolve issues or challenges in the application development process.

Professional & Technical Skills:
- Must-have skills: proficiency in Apache Spark, PySpark, and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Apache Spark.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
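The data-munging techniques named above (cleaning, transformation, normalization) can be sketched in a few lines. This is a hypothetical, dependency-free illustration; at the scale this role describes, the same steps would run as PySpark or Pandas transformations:

```python
# Hypothetical sketch of basic data munging: cleaning (drop incomplete
# rows, coerce types) followed by min-max normalization of one field.

def clean(records):
    """Cleaning: discard rows missing 'amount' and coerce it to float."""
    out = []
    for r in records:
        if r.get("amount") in (None, ""):
            continue  # incomplete row: drop it
        out.append({**r, "amount": float(r["amount"])})
    return out

def normalize(records, field="amount"):
    """Min-max normalization of one numeric field to the [0, 1] range."""
    values = [r[field] for r in records]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on constant columns
    return [{**r, field: (r[field] - lo) / span} for r in records]

rows = clean([{"amount": "10"}, {"amount": None}, {"amount": "30"}])
scaled = normalize(rows)
```

Min-max scaling is only one normalization choice; z-score standardization is the usual alternative when the downstream algorithm assumes centered data.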
Posted 1 month ago
12 - 17 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure; can deploy infrastructure and platform environments and create a proof of architecture to test viability, security, and performance.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

We are seeking a highly motivated and experienced DevOps Infra Engineer to join our team and manage Kubernetes (K8s) infrastructure. You will be responsible for implementing and maintaining Infrastructure as Code (IaC) using Terraform or similar tooling, and for ensuring the smooth deployment and management of our Kubernetes stack in on-prem environments. You will also be instrumental in troubleshooting issues, optimizing infrastructure, and implementing and managing monitoring tools for observability.

Primary skills: Kubernetes, Kubegres, Kubekafka, Grafana, Redis, Prometheus
Secondary skills: Keycloak, MetalLB, Ingress, Elasticsearch, Superset, OpenEBS, Istio, Secrets, Helm, Nussknacker, Velero, Druid

Responsibilities:
- Containerization: working experience with Kubernetes and Docker for containerized application deployments on-prem (GKE/K8s); knowledge of Helm charts and their application in Kubernetes clusters.
- Collaboration and communication: work effectively in a collaborative team environment with developers, operations, and other stakeholders; communicate technical concepts clearly and concisely.
- CI/CD: design and implement CI/CD pipelines using Jenkins, including pipelines, stages, and jobs; utilize Jenkins Pipeline and Groovy scripting for advanced pipeline automation; integrate Terraform with Jenkins for IaC management and infrastructure provisioning.
- Infrastructure as Code (IaC): develop and manage infrastructure using Terraform, including writing tfvars files and module code; set up IaC pipelines using Terraform, Jenkins, and cloud environments such as Azure and GCP; troubleshoot issues in Terraform code and ensure smooth infrastructure deployments.
- Cloud platforms: deep understanding of both Google Cloud and Azure; experience managing and automating cloud resources in these environments.
- Monitoring & logging: configure and manage monitoring tools such as Splunk, Grafana, and ELK for application and infrastructure health insights.
- GitOps: implement GitOps practices for application and infrastructure configuration management.
- Scripting and automation: proficiency in scripting languages such as Python and Bash for automating tasks; use Ansible or Chef for configuration management.

Qualifications:
- 4-9 years of experience as a Kubernetes & DevOps engineer or in a similar role, with 12+ years of total experience in cloud and infrastructure managed services.
- Strong understanding of CI/CD principles and practices.
- Proven experience with Jenkins or comparable CI/CD tooling, including pipelines, scripting, and plugins.
- Expertise in Terraform and IaC principles.
- Experience managing Kubernetes on an on-prem platform is preferred.
- Exposure to monitoring and logging tools such as Splunk, Grafana, or ELK.
- Experience with GitOps practices.
- Proficiency in scripting languages such as Python and Bash.
- Experience with configuration management tools such as Ansible or Chef.
- Hands-on experience with Kubernetes and Docker; knowledge of Helm charts and their application in Kubernetes clusters.

Must: flexible to cover part of US working hours (24/7 business requirement). Excellent communication and collaboration skills; fluent in English.
Posted 1 month ago
3 - 8 years
3 - 7 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Google Kubernetes Engine
Good-to-have skills: Kubernetes, Google Cloud Compute Services
Minimum experience: 3 years
Educational Qualification: 15 years full-time education

Job Summary: We are seeking a motivated and talented GCP & Kubernetes Engineer to join our growing cloud infrastructure team. This role will be a key contributor to building and maintaining our Kubernetes platform, working closely with architects to design, deploy, and manage cloud-native applications on Google Kubernetes Engine (GKE).

Responsibilities:
- Extensive hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations.
- Demonstrated expertise in operating and managing container orchestration engines such as Docker or Kubernetes.
- Knowledge of or experience with Kubernetes tools such as Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus.
- Proven track record in supporting and deploying various public cloud services.
- Experience building or managing self-service platforms to boost developer productivity.
- Proficiency with Infrastructure as Code (IaC) tools such as Terraform.
- Skilled in diagnosing and resolving complex issues in automation and cloud environments.
- Advanced experience architecting and managing highly available, high-performance multi-zonal or multi-regional systems.
- Strong understanding of infrastructure CI/CD pipelines and associated tools.
- Collaborate with internal teams and stakeholders to understand user requirements and implement technical solutions.
- Experience working in GKE and Edge/GDCE environments.
- Assist development teams in building and deploying microservices-based applications in public cloud environments.

Technical skillset:
- Minimum of 3 years of hands-on experience migrating or deploying GCP cloud-based solutions.
- At least 3 years of experience architecting, implementing, and supporting GCP infrastructure and topologies.
- Over 3 years of experience with GCP IaC, particularly Terraform, including writing and maintaining Terraform configurations and modules.
- Experience deploying container-based systems such as Docker or Kubernetes on both private and public clouds (GCP GKE).
- Familiarity with CI/CD tools (e.g., GitHub) and processes.

Certifications:
- GCP ACE certification is mandatory.
- CKA certification is highly desirable.
- HashiCorp Terraform certification is a significant plus.
Posted 1 month ago
7 - 12 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

About the role: We are looking for an experienced Kubernetes Architect to join our growing cloud infrastructure team. This role is responsible for architecting, designing, and implementing scalable, secure, and highly available cloud-native applications on Kubernetes. You will leverage Kubernetes along with associated technologies such as Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build resilient systems that meet both business and technical needs; Google Kubernetes Engine (GKE) will be considered an additional skill. As a Kubernetes Architect, you will play a key role in defining best practices, optimizing the infrastructure, and providing architectural guidance to cross-functional teams.

Key Responsibilities:
- Architect Kubernetes solutions: design and implement scalable, secure, and high-performance Kubernetes clusters.
- Cloud-native application design: collaborate with development teams to design cloud-native applications, ensuring microservices are properly architected and optimized for Kubernetes environments.
- Kafka management: architect and manage Apache Kafka clusters using Kubekafka, ensuring reliable, real-time data streaming and event-driven architectures.
- Database architecture: use Kubegres to manage high-availability PostgreSQL clusters in Kubernetes, ensuring data consistency, scaling, and automated failover.
- Helm chart development: create, maintain, and optimize Helm charts for consistent deployment and management of applications across Kubernetes environments.
- Ingress & networking: architect and configure Ingress controllers (e.g., NGINX, Traefik) for secure, efficient external access to Kubernetes services, including SSL termination, load balancing, and routing.
- Caching and performance optimization: leverage Redis to design efficient caching and session-management solutions, optimizing application performance.
- Monitoring & observability: lead the implementation of Prometheus for metrics collection and Grafana for real-time monitoring dashboards visualizing the health and performance of infrastructure and applications.
- CI/CD integration: design and implement continuous integration and continuous deployment (CI/CD) pipelines to streamline the deployment of Kubernetes-based applications.
- Security & compliance: ensure Kubernetes clusters follow security best practices, including RBAC, network policies, and proper configuration of secrets management.
- Automation & scripting: develop automation frameworks using tools such as Terraform, Helm, and Ansible to ensure repeatable and scalable deployments.
- Capacity planning and cost optimization: optimize resource usage within Kubernetes clusters for both performance and cost-efficiency, utilizing cloud tools and services.
- Leadership & mentorship: provide technical leadership to development, operations, and DevOps teams, offering mentorship, architectural guidance, and best practices.
- Documentation & reporting: produce comprehensive architecture diagrams, design documents, and operational playbooks to ensure knowledge transfer across teams and maintain system reliability.

Required Skills & Experience:
- 10+ years of experience in cloud infrastructure engineering, with at least 5 years of hands-on experience with Kubernetes.
- Strong expertise in Kubernetes for managing containerized applications in the cloud.
- Experience deploying and managing container-based systems on both private and public clouds (Google Kubernetes Engine).
- Proven experience with Kubekafka for managing Apache Kafka clusters in Kubernetes environments.
- Expertise in managing PostgreSQL clusters with Kubegres and implementing high-availability database solutions.
- In-depth knowledge of Helm for managing Kubernetes applications, including development of custom Helm charts.
- Experience with Ingress controllers (e.g., NGINX, Traefik) for managing external traffic in Kubernetes.
- Hands-on experience with Redis for caching, session management, and as a message broker in Kubernetes environments.
- Advanced knowledge of Prometheus for monitoring and Grafana for visualization and alerting in cloud-native environments.
- Experience with CI/CD pipelines for automated deployment and integration using tools such as Jenkins, GitLab CI, or CircleCI.
- Solid understanding of networking, including load balancing, DNS, SSL/TLS, and ingress/egress configuration in Kubernetes.
- Familiarity with Terraform and Ansible for infrastructure automation.
- Deep understanding of Kubernetes security best practices, such as RBAC, network policies, and secrets management.
- Knowledge of DevSecOps practices to ensure secure application delivery.

Certifications:
- Google Cloud Platform (GCP) certification is mandatory.
- Kubernetes certification (CKA or CKAD) is highly preferred.
- HashiCorp Terraform certification is a significant plus.
Posted 1 month ago
3 - 8 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing; create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: GCP Dataflow
Good-to-have skills: Google BigQuery
Minimum experience: 3 years
Educational Qualification: Any Graduate

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using GCP Dataflow.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing using GCP Dataflow.
- Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.
- Collaborate with cross-functional teams to identify and resolve data-related issues.
- Develop and maintain documentation for data solutions and processes.
- Stay updated with the latest advancements in data engineering technologies and integrate innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: experience with GCP Dataflow.
- Good-to-have skills: experience with Google BigQuery.
- Strong understanding of ETL processes and data migration.
- Experience in data modeling and database design.
- Experience with data warehousing and data lake concepts.
- Experience in programming languages such as Python, Java, or Scala.

Additional Information:
- The candidate should have a minimum of 3 years of experience in GCP Dataflow.
- The ideal candidate will have a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Indore office.
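The ETL (extract, transform, load) processes this role centers on have a simple underlying shape. As a hypothetical, dependency-free sketch (a real GCP Dataflow job would express these stages as Apache Beam transforms rather than plain functions):

```python
# Hypothetical ETL sketch: extract -> transform -> load as three
# composable stages, each a generator/consumer over records.

def extract(source):
    """Extract: yield raw records from a source (here, any iterable)."""
    yield from source

def transform(records):
    """Transform: enforce data quality and reshape the good rows."""
    for rec in records:
        if rec.get("id") is None:
            continue  # data-quality gate: drop records missing the key
        yield {"id": rec["id"], "value": rec.get("value", 0) * 2}

def load(records, sink):
    """Load: write transformed records to the target system."""
    for rec in records:
        sink.append(rec)  # stand-in for a BigQuery / warehouse write
    return sink

sink = load(transform(extract([{"id": 1, "value": 5}, {"value": 9}])), [])
```

Chaining the stages as generators keeps memory bounded, which is the same streaming property Dataflow pipelines provide at cluster scale.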
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role: Integration Engineer
Project Role Description: Provide consultative business and system integration services to help clients implement effective solutions; understand and translate customer needs into business and technology solutions; drive discussions and consult on transformation, the customer journey, and functional/application designs; and ensure technology and business solutions represent business requirements.
Must-have skills: Microsoft 365
Good-to-have skills: Microsoft PowerShell, Microsoft 365 Security & Compliance
Minimum experience: 3 years
Educational Qualification: 15 years full-time education

Summary: As an Integration Engineer, you will provide consultative business and system integration services to help clients implement effective solutions, translating customer needs into business and technology solutions, driving discussions on transformation, the customer journey, and functional/application designs, and ensuring technology and business solutions represent business requirements.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement integration solutions for clients.
- Collaborate with cross-functional teams to ensure successful project delivery.

Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft 365.
- Good-to-have skills: experience with Microsoft PowerShell and Microsoft 365 Security & Compliance.
- Strong understanding of cloud-based integration technologies.
- Experience designing and implementing scalable integration solutions.
- Knowledge of API integration and data-mapping techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft 365.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
5 - 10 years
8 - 13 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Job Summary: We are looking for an experienced Kubernetes Specialist to join our cloud infrastructure team. You will work closely with architects and engineers to design, implement, and optimize cloud-native applications on Google Kubernetes Engine (GKE). This role focuses on providing expertise in Kubernetes, container orchestration, and cloud infrastructure management, ensuring the seamless operation of scalable, secure, and high-performance applications on GKE and other cloud environments.

Responsibilities:
- Kubernetes implementation: design, implement, and manage Kubernetes clusters for containerized applications, ensuring high availability and scalability.
- Cloud-native application design: work with teams to deploy, scale, and maintain cloud-native applications on GKE.
- Kubernetes tools expertise: utilize Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build and maintain resilient systems.
- Infrastructure automation: develop and implement automation frameworks using Terraform and other tools to streamline Kubernetes deployments and cloud infrastructure management.
- CI/CD implementation: design and maintain CI/CD pipelines to automate deployment and testing for Kubernetes-based applications.
- Kubernetes networking & security: ensure secure and efficient cluster networking, including Ingress controllers (e.g., NGINX, Traefik), RBAC, and secrets management.
- Monitoring & observability: lead the integration of monitoring solutions using Prometheus for metrics and Grafana for real-time dashboard visualization.
- Performance optimization: optimize resource utilization within GKE clusters, ensuring both performance and cost-efficiency.
- Collaboration: work with internal development, operations, and security teams to meet user requirements and implement Kubernetes solutions.
- Troubleshooting & issue resolution: efficiently troubleshoot and resolve complex issues related to containerized applications, Kubernetes clusters, and cloud infrastructure.

Technical skillset:
- GCP & Kubernetes experience: minimum of 3 years of hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations, including GKE.
- Container management: proficiency with container orchestration engines such as Kubernetes and Docker.
- Kubernetes tools knowledge: experience with Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus for managing Kubernetes-based applications.
- Infrastructure as Code (IaC): strong experience with Terraform for automating infrastructure provisioning and management.
- CI/CD pipelines: hands-on experience building and managing CI/CD pipelines for Kubernetes applications using tools such as Jenkins, GitLab, or CircleCI.
- Security & networking: knowledge of Kubernetes networking (DNS, SSL/TLS), security best practices (RBAC, network policies, and secrets management), and Ingress controllers (e.g., NGINX).
- Cloud & DevOps tools: familiarity with cloud services and DevOps tools such as GitHub, Jenkins, and Ansible.
- Monitoring expertise: in-depth experience with Prometheus and Grafana for operational monitoring, alerting, and creating actionable insights.

Certifications:
- Google Cloud Platform (GCP) Associate Cloud Engineer (ACE) certification is required.
- Certified Kubernetes Administrator (CKA) is highly preferred.
Posted 1 month ago
3 - 5 years
5 - 7 Lacs
Jaipur
Work from Office
Skill required: Procure to Pay - Invoice Processing
Designation: Procure to Pay Operations Analyst
Qualifications: BCom/MCom
Years of Experience: 3 to 5 years

What would you do? You will be aligned with our Finance Operations vertical, helping us determine financial outcomes by collecting operational data and reports, conducting analysis, and reconciling transactions. This includes boosting vendor compliance, cutting savings erosion, improving discount capture through preferred suppliers, and confirming pricing and terms prior to payment. You will be responsible for accounting of goods and services through requisitioning, purchasing, and receiving, and will look after the end-to-end order sequence of the procurement and financial process.

The Accounts Payable Processing team focuses on designing, implementing, managing, and supporting accounts payable activities by applying the relevant processes, policies, and applications. The team is responsible for timely and accurate billing and processing of invoices, managing purchase and non-purchase orders, and two-way and three-way matching of invoices. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice-processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Good verbal and written communication skills
- Well versed with the accounts payable cycle, sub-processes, and terminology
- Good understanding of PO vs. non-PO invoices
- Good understanding of withholding tax treatment in invoice processing
- Good understanding of month-end, quarter-end, and year-end closing, along with accruals and reporting
- Good understanding of accounting journal entries
- Understanding of employee expense claim processing
- Understanding of expense accruals
- Vendor reconciliations
- Reporting and audit assignments
- Working knowledge of MS Office
- Problem-solving attitude
- Teamwork and coordination
- Readiness to work in night shifts
- Knowledge of invoice-processing tools
- Knowledge of current technologies in the PTP domain
- Analytical skill
- Understanding of RPA

Roles and Responsibilities:
- In this role you are required to analyze and solve lower-complexity problems.
- Your day-to-day interaction is with peers within Accenture before updating supervisors.
- You may have limited exposure to clients and/or Accenture management.
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments.
- The decisions you make impact your own work and may impact the work of others.
- You will be an individual contributor as part of a team, with a focused scope of work.
- Please note that this role may require you to work in rotational shifts.

Qualifications: BCom, MCom
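The two-way and three-way matching described above has a simple logical core: an invoice clears for payment only when it agrees with the purchase order (two-way) and, where goods are involved, with the goods receipt as well (three-way). A hypothetical sketch (field names and tolerance handling are illustrative, not any specific AP system's schema):

```python
# Hypothetical illustration of AP invoice matching. Real systems add
# tolerance bands, partial receipts, and workflow routing on mismatch.

def two_way_match(invoice, po, price_tolerance=0.0):
    """Invoice vs. purchase order: quantity and unit price must agree."""
    return (invoice["qty"] == po["qty"]
            and abs(invoice["unit_price"] - po["unit_price"]) <= price_tolerance)

def three_way_match(invoice, po, receipt, price_tolerance=0.0):
    """Adds the goods-receipt check: never pay for more than was received."""
    return (two_way_match(invoice, po, price_tolerance)
            and invoice["qty"] <= receipt["qty_received"])

invoice = {"qty": 10, "unit_price": 4.50}
po = {"qty": 10, "unit_price": 4.50}
grn = {"qty_received": 10}  # goods receipt note
ok = three_way_match(invoice, po, grn)
```

An invoice failing either check would typically be parked for exception handling rather than rejected outright.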
Posted 1 month ago
12 - 17 years
14 - 19 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Apache Spark, Python (Programming Language), Google BigQuery
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, ensuring they are developed and implemented efficiently and effectively while meeting the needs of the organization. Your typical day will involve collaborating with the team, making team decisions, and engaging with multiple teams to contribute to key decisions. You will also provide solutions to problems that apply across multiple teams, showcasing your expertise and problem-solving skills.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Ensure efficient and effective development and implementation of applications.
- Design and build applications to meet business process and application requirements.
- Contribute to the decision-making process and provide valuable insights.

Professional & Technical Skills:
- Must-have skills: proficiency in PySpark.
- Good-to-have skills: experience with Apache Spark, Python, and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
3 - 5 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA (SSI: No Technology Specialization)
Minimum experience: 3 years
Educational Qualification: 15 years full-time education

Key Responsibilities:
1. Assist with the data platform blueprint and design, encompassing the relevant data platform components.
2. Collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
3. As the Data Engineer, perform tasks such as data modeling, data pipeline builds, data lake builds, and work with scalable programming frameworks.

Technical Experience:
1. Expert in Python (NO FLEX); strong hands-on knowledge of SQL (NO FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage.
2. Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (NO FLEX).
3. Proficient with tools to automate AZDO CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.

Professional Attributes:
1. Good communication.
2. Good leadership and team-handling skills.
3. Analytical skills, presentation skills, and the ability to work under pressure.
4. Should be able to work in shifts whenever required.

Qualification: 15 years full-time education
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role : AI / ML Engineer Project Role Description : Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing. Must have skills : Google Cloud Machine Learning Services Good to have skills : GCP Dataflow, Google Dataproc, Google Pub/Sub Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Key Responsibilities : A: Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. B: Collaborate with data scientists to deploy machine learning models. C: Ensure the scalability and efficiency of data processing pipelines. Technical Experience : A: Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. B: Hands-on experience with data engineering in a cloud environment. Professional Attributes : A: Strong problem-solving skills in optimizing data workflows. B: Effective collaboration with data science and engineering teams. Qualifications 15 years full time education
Posted 1 month ago
16 - 25 years
18 - 27 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Artificial Intelligence (AI) Designation: AI/ML Computational Science Sr Manager Qualifications: Any Graduation Years of Experience: 16 to 25 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks that require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. What are we looking for?
Machine Learning; process orientation; thought leadership; commitment to quality. Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations where analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting strategic direction to establish near-term goals for your area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and in determining objectives and approaches to critical assignments. Your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside it. You will manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualifications Any Graduation
Posted 1 month ago
2 - 4 years
5 - 8 Lacs
Pune
Work from Office
We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines. Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects and customer-reported issues, and debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understanding of working in a DevOps model.
Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production. Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres. Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
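The qualifications above mention PyTest and mocking frameworks. As a small, hedged sketch using only the standard library's `unittest.mock` (the service and function names here are illustrative, not any real product's API), isolating an external dependency in a test looks like this:

```python
from unittest.mock import Mock

# Hypothetical service wrapper: in production this would call a real API.
class EnrichmentClient:
    def lookup(self, user_id):
        raise NotImplementedError("real implementation makes a network call")

def enrich_user(client, user_id):
    """Combine a user id with attributes fetched from an external service."""
    attrs = client.lookup(user_id)
    return {"id": user_id, **attrs}

# In a test, replace the client with a mock so no network access is needed.
fake = Mock(spec=EnrichmentClient)
fake.lookup.return_value = {"plan": "pro"}

result = enrich_user(fake, 42)
assert result == {"id": 42, "plan": "pro"}
fake.lookup.assert_called_once_with(42)
```

Passing `spec=EnrichmentClient` makes the mock reject calls to attributes the real class does not have, which keeps tests honest as the wrapped class evolves.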
Posted 1 month ago
12 - 17 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : Apache Spark, Python (Programming Language), Google BigQuery Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Lead the team in implementing PySpark solutions effectively. - Conduct code reviews and ensure adherence to best practices. - Provide technical guidance and mentorship to junior team members. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark, Python (Programming Language), Apache Spark, Google BigQuery. - Strong understanding of distributed computing and parallel processing. - Experience in optimizing PySpark jobs for performance. - Knowledge of data processing and transformation techniques. - Familiarity with cloud platforms for deploying PySpark applications. Additional Information: - The candidate should have a minimum of 12 years of experience in PySpark. - This position is based at our Gurugram office. - A 15 years full-time education is required. Qualifications 15 years full time education
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : Btech Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze requirements. - Design, develop, and test applications based on business needs. - Troubleshoot and debug applications to ensure optimal performance. - Implement security and data protection measures. - Document technical specifications and user guides. - Stay up-to-date with emerging technologies and industry trends. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Strong understanding of SQL and database concepts. - Experience with data modeling and schema design. - Knowledge of ETL processes and data integration techniques. - Familiarity with cloud platforms such as Google Cloud Platform. - Good To Have Skills: Experience with data visualization tools such as Tableau or Power BI. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Hyderabad office. - A Btech degree is required. Qualifications Btech
Posted 1 month ago
4 - 9 years
16 - 31 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities: Execute project-specific development activities in accordance with applicable standards and quality parameters. Develop and review code, and set up the right environment for the projects. Ensure delivery within schedule by adhering to the engineering and quality standards. Own and deliver end-to-end projects within GCP for the Payments Data Platform. Be available once a month on the support rota, for a week, for GCP 24x7 on-call. Basic knowledge of Payments ISO standards, message types, etc. Able to work under pressure on deliverables (P1 violations, incidents). Should be fluent and clear in communication, written and verbal. Should be able to follow Agile ways of working. Must have hands-on experience in Java and GCP; shell script and Python knowledge a plus. In-depth knowledge of Java Spring Boot. Should have experience in GCP Dataflow, Bigtable, BigQuery, etc. Should have experience managing large databases. Should have worked on requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress).
Posted 1 month ago
7 - 12 years
13 - 17 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge maintained by attending educational workshops and reviewing publications.
Posted 1 month ago
1 - 3 years
3 - 6 Lacs
Bengaluru
Work from Office
Skill required: Record To Report - Invoice Processing Designation: Record to Report Ops Associate Qualifications: BCom/MCom Years of Experience: 1 to 3 years Language Ability: English (Domestic) - Expert What would you do? You will be aligned with our Finance Operations vertical and will be helping us determine financial outcomes by collecting operational data/reports, whilst conducting analysis and reconciling transactions. The work includes posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors. What are we looking for? Google Cloud SQL; adaptable and flexible; ability to perform under pressure; problem-solving skills; agility for quick learning; commitment to quality. Roles and Responsibilities: In this role you are required to solve routine problems, largely through precedent and referral to general guidelines. Your expected interactions are within your own team and with your direct supervisor. You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments. The decisions that you make will impact your own work. You will be an individual contributor as part of a team, with a predetermined, focused scope of work. Please note that this role may require you to work in rotational shifts. Qualification BCom, MCom
Posted 1 month ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview: The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms. Key Responsibilities: Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP. Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage. Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow). Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality. Define and enforce data engineering best practices including version control, testing, code reviews, and documentation. Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources. Implement and monitor data quality, lineage, and governance frameworks across the data platform. Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools. Mentor team members and contribute to the growth of technical capabilities across the organization. Qualifications: Education : Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field. Experience : 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams. Skills : Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. 
Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments. Certifications : GCP Professional Data Engineer certification.
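The lead role above calls for implementing data quality and governance frameworks across the platform. As a minimal, hedged sketch of the idea (a rule-based batch check of the sort a pipeline might run before loading to BigQuery; the function name and rules are illustrative assumptions, not the employer's actual framework):

```python
def run_quality_checks(rows, required_fields, unique_key):
    """Return a list of human-readable violations for a batch of rows.

    Checks two common rules: required fields must be present and
    non-empty, and the unique key must not repeat within the batch.
    """
    violations = []
    seen_keys = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                violations.append(f"row {i}: missing required field '{field}'")
        key = row.get(unique_key)
        if key in seen_keys:
            violations.append(f"row {i}: duplicate {unique_key} {key!r}")
        seen_keys.add(key)
    return violations

rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": None},  # duplicate key and missing amount
]
issues = run_quality_checks(rows, required_fields=["amount"], unique_key="order_id")
assert issues == [
    "row 1: missing required field 'amount'",
    "row 1: duplicate order_id 1",
]
```

In practice such checks run as a gating step in the orchestrator (e.g. an Airflow/Composer task) so bad batches fail loudly instead of loading silently.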
Posted 1 month ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description - Technical Skills: Top skills for this position: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS). Data warehousing knowledge. Hands-on experience in the Python language and SQL databases. Analytical technical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements. Excellent communication with different stakeholders (business, technical, project). Good understanding of the overall Big Data and Data Science ecosystem. Experience with building and deploying containers as services using Swarm/Kubernetes. Good understanding of container concepts like building lean and secure images. Understanding of modern DevOps pipelines. Experience with stream data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer certification. Roles and Responsibilities: Design, build & manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies. Manage the development life cycle for agile software development projects. Convert proofs of concept into industrialized Machine Learning Model deployments (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning, and understanding of dependencies. Develop & implement robust solutions in test & production environments.
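The real-time processing this role describes (Kafka or Pub/Sub feeding Beam or Spark Streaming) typically reduces to windowed aggregation over keyed events. A framework-free sketch of a tumbling-window count, with invented event shapes standing in for messages a real pipeline would consume from Kafka or Pub/Sub:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per key within fixed, non-overlapping time windows.

    `events` are (timestamp_seconds, key) pairs. Each event is assigned
    to the window starting at the largest multiple of `window_seconds`
    not exceeding its timestamp, mirroring Beam's FixedWindows.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "pay"), (5, "pay"), (61, "pay"), (62, "refund")]
result = tumbling_window_counts(events, window_seconds=60)
assert result == {(0, "pay"): 2, (60, "pay"): 1, (60, "refund"): 1}
```

A production pipeline adds what this sketch deliberately omits: watermarks and triggers for late data, and exactly-once sinks, which is where Beam on Dataflow or Spark Structured Streaming earn their keep.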
Posted 1 month ago
4 - 7 years
6 - 16 Lacs
Bengaluru
Work from Office
Senior Software Engineer Google Cloud Platform (GCP) Location: Bangalore, India Why Join Fossil Group? At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you. Make an Impact (Job Summary + Responsibilities) We are looking for a Senior Software Engineer – GCP to join our growing Cloud & Data Engineering team at Fossil Group . This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc, BigQuery , and strong development skills in Java, Python, and SQL . You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs. What you will do in this role: Design, develop, deploy, and maintain data pipelines and services using GCP technologies including Dataflow, Dataproc, BigQuery, Composer , and others. Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions. Develop and enhance cloud-based batch/streaming jobs using Java or Python. Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments. Perform unit testing, integration testing, and resolve issues arising during the QA lifecycle. Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery. Own and manage project timelines, priorities, and documentation. Continuously improve processes and stay current with GCP advancements and big data technologies. 
Who You Are (Requirements) Bachelor's degree in Computer Science or related field. 4-7 years of experience as a DB/SQL Developer or Java/Python Developer with strong SQL capabilities. Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, Data Fusion . Excellent command of SQL with the ability to write complex queries and perform advanced data transformation. Strong programming skills in Java and/or Python , specifically for building cloud-native data pipelines. Experience with relational and NoSQL databases. Ability to understand and translate business requirements into functional specifications. Familiarity with BI dashboards and Google Data Studio is a plus. Strong problem-solving, communication, and collaboration skills. Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies. Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions. What We Offer Comprehensive Benefits: Includes health and well-being services. Paid Parental Leave & Return to Work Program: Support for new parents and caregivers with paid leave and a flexible phase-back schedule. Generous Paid Time Off: Includes Sick Time, Personal Days, and Summer Flex Fridays. Employee Discounts: Save on Fossil merchandise. EEO Statement At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community. We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.
Posted 1 month ago
4 - 9 years
22 - 30 Lacs
Pune
Hybrid
Primary Skills: SQL (Data Analysis and Development) Alternate Skills: Python, SharePoint, AWS, ETL, Telecom (especially the Fixed Network domain). Location: Pune Working Persona: Hybrid Experience: 4 to 10 years Core competencies, knowledge and experience: Essential: Strong SQL experience - advanced level of SQL. Excellent data interpretation skills. Good knowledge of ETL and business intelligence, and a good understanding of a range of data manipulation and analysis techniques. Working knowledge of large information technology development projects using methodologies and standards. Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the business team and share ideas. Strong analytical, problem-solving and decision-making skills, and the attitude to plan and organize work to deliver as agreed. Ability to work under pressure to tight deadlines. Hands-on experience working with large datasets. Able to manage different stakeholders.
Posted 1 month ago
10 - 20 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect for an MNC, Pan-India location. I'm sharing the JD with you; please have a look and revert with the below details and your updated resume. Apply only if you can join in 10 days. It is 5-days work from office. We don't process high-notice-period candidates. Role: GCP Data Architect Experience: 10+ years Mode: Permanent Work Location: Pan India Notice Period: Immediate to 10 days Mandatory Skills: GCP, architecture experience, Big Data, data modelling, BigQuery. Full Name (as per Aadhaar card): Email ID: Mobile Number: Alternate No: Qualification: Graduation Year: Regular Course: Total Experience: Relevant Experience: Current Organization: Working as Permanent Employee: Payroll Company: Experience in GCP: Experience in Architecture: Experience in GCP Data Architecture: Experience in Big Data: Experience in BigQuery: Experience in Data Management: Official Notice Period: Serving Notice Period: Current Location: Preferred Location: Current CTC: Expected CTC: CTC Breakup: PAN Card Number: Date of Birth: Any Offer in Hand: Offered CTC: LWD: Can you join immediately: Ready to work from office for 5 days: Job Description: GCP Data Architect We are seeking a skilled Data Solution Architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecting, solution design, and data management practices. Responsibilities: Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards.
Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques. Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions. Extremely strong in BigQuery design and development. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions. Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills. Regards, Rejeesh S Email : rejeesh.s@jobworld.jobs Mobile : +91 - 9188336668
Posted 1 month ago
5 - 9 years
10 - 20 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Mandatory Skills: GCS, Composer, BigQuery, Azure, Azure Databricks, ADLS. Experience: 5-10 years. Good-to-have skills: knowledge of CDP (customer data platform); airline domain knowledge. Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks. Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow. Experience with GCP Cloud Composer, BigQuery, and Dataproc. Offer system support as part of a support rotation with other team members. Operationalize open-source data-analytic tools for enterprise use. Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification. Understand and follow the company development lifecycle to develop, deploy, and deliver the solutions. Minimum Qualifications: Bachelor's degree in Computer Science, CIS, or a related field. 5-7 years of IT experience in software engineering or a related field. Experience on projects involving the implementation of software development life cycles (SDLC).
Posted 1 month ago