3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
As a Business Analysis Expert at Digital Wolf, you will play a crucial role in collaborating with stakeholders to understand business needs and objectives. Your responsibilities will include analyzing complex data sets to identify trends, patterns, and opportunities. You will develop business cases, process models, and requirements documentation to support project planning, execution, and performance tracking. Your insights and recommendations will be instrumental in guiding strategic decisions at the leadership level. Your day-to-day tasks will involve performing data analysis, improving business processes, and facilitating communication between various teams.

To excel in this role, you must possess strong analytical skills, problem-solving abilities, and excellent communication skills. Proficiency in tools like Excel, SQL, Tableau, Power BI, or similar platforms is essential. You should be adept at identifying and documenting business requirements and understanding business processes, systems, and data flow. Familiarity with Agile/Scrum methodologies and related tools will be beneficial in your role. The ability to work independently, manage multiple tasks effectively, and present findings and recommendations to leadership is key to success in this position. A Bachelor's degree in Business, Finance, Information Technology, or a related field is required. Prior experience in the digital marketing industry will be considered a plus.

If you are passionate about leveraging data-driven insights to drive business growth and thrive in a dynamic and customer-centric environment, we invite you to join our team at Digital Wolf and contribute to our mission of empowering businesses to succeed in the digital landscape.
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Company
Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work as one global team with our customers at the center of everything we do, and they push us to take care of ourselves, each other, and our communities.
Job Description Summary: What you need to know about the role: A Business Systems Analyst passionate about delivering quality deliverables in a fast-paced environment with an undivided customer focus.
Meet our team: The Finance Technology team consists of a diverse group of talented, driven subject-matter experts who relentlessly work towards enabling best-in-class solutions for our customers and transforming current-state solutions. You will work with this team to set up finance solutions, explore avenues to automate, challenge the status quo, and simplify the current state through transformation.
Job Description: Your way to impact
Your day to day:
- Build scalable systems by leading discussions with the business, understanding the requirements from both customers and the business, and delivering requirements to the engineering team to guide them in building a robust, scalable solution.
- Have hands-on technical experience to provide support across multiple platforms (GCP, Python, Hadoop, SAP, Teradata, machine learning).
- Establish a consistent project management framework and develop processes to deliver high-quality software in rapid iterations for business partners in multiple geographies.
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Experience in balancing production platform stability, feature delivery, and the reduction of technical debt across a broad landscape of technologies.
What Do You Need To Bring:
- You have consistently high standards, and your passion for quality is inherent in everything you do.
- Experience with GCP BigQuery, SQL, and Dataflow.
- 4+ years of relevant experience.
- Data warehouses, data marts, distributed data platforms, and data lakes.
- Data modeling and schema design.
- Reporting/visualization: Looker, Tableau, Power BI.
- Knowledge of statistical and machine learning models.
- Excellent structured thinking skills, with the ability to break down multi-dimensional problems.
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders.
Our Benefits:
Who We Are: To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx
Commitment to Diversity and Inclusion
For general requests for consideration of your skills, please join our Talent Community.
We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply.
REQ ID R0115599
Posted 2 months ago
5.0 - 10.0 years
16 - 31 Lacs
Pune
Hybrid
Software Engineer - Lead/Senior Engineer
Bachelor's in Computer Science, Engineering, or equivalent experience
7+ years of experience in core Java and the Spring Framework (required)
2+ years of cloud experience (GCP, AWS, or Azure; GCP preferred) (required)
Experience in big data processing on a distributed system (required)
Experience in databases: RDBMS, NoSQL, and cloud-native databases (required)
Experience in handling various data formats like flat files, JSON, Avro, XML, etc., including defining the schemas and contracts (required)
Experience in implementing data pipelines (ETL) using Dataflow (Apache Beam)
Experience in microservices and integration patterns for APIs with data processing
Experience in data structures and in defining and designing data models
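For readers unfamiliar with the stack, below is a minimal sketch of the kind of Dataflow (Apache Beam) ETL pipeline this posting refers to. It uses the Beam Python SDK rather than Java purely for brevity, and the file paths and field names are hypothetical placeholders.

```python
# Minimal Apache Beam batch ETL sketch: read JSON lines, clean them, write them out.
# Runs locally on the DirectRunner; passing Dataflow options (project, region,
# temp_location, runner=DataflowRunner) would run the same pipeline on GCP.
import json

import apache_beam as beam


def parse_record(line: str) -> dict:
    """Parse one JSON line and keep only the fields the downstream schema expects."""
    record = json.loads(line)
    return {"id": record.get("id"), "amount": float(record.get("amount", 0))}


def run() -> None:
    with beam.Pipeline() as p:
        (
            p
            | "ReadJsonLines" >> beam.io.ReadFromText("input/events.jsonl")   # hypothetical path
            | "Parse" >> beam.Map(parse_record)
            | "FilterPositive" >> beam.Filter(lambda r: r["amount"] > 0)
            | "Format" >> beam.Map(json.dumps)
            | "Write" >> beam.io.WriteToText("output/clean_events", file_name_suffix=".jsonl")
        )


if __name__ == "__main__":
    run()
```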
Posted 2 months ago
2.0 - 5.0 years
3 - 6 Lacs
Mumbai
Work from Office
Skill required: Tech for Operations - Automation Anywhere
Designation: App Automation Eng Analyst
Qualifications: BE
Years of Experience: 3 to 5 years
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results. We work closely with the sales, offering, and delivery teams to identify and build innovative solutions. The Tech for Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. Automate any process end-to-end with cognitive software robots using the robotic process automation software Automation Anywhere Enterprise.
What are we looking for? Adaptable and flexible; ability to perform under pressure; problem-solving skills; ability to establish strong client relationships; agility for quick learning. This request is raised for Contract Conversion.
Roles and Responsibilities: In this role you are required to analyze and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. In this role you may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work. Please note that this role may require you to work in rotational shifts.
Qualification: BE
Posted 2 months ago
3.0 - 6.0 years
5 - 9 Lacs
Chennai
Work from Office
Description: Analyzing and translating business needs into long-term solution data models. Evaluating existing data systems. Working with the development team to create conceptual data models and data flows. Developing best practices for data coding to ensure consistency within the system. Reviewing modifications of existing systems for cross-compatibility. Implementing data strategies and developing physical data models. Updating and optimizing local and metadata models. Evaluating implemented data systems for variances, discrepancies, and efficiency. Maintaining logical and physical data models along with accurate metadata. Analyzing data-related system integration challenges and proposing appropriate solutions with a strategic approach. Should have strong knowledge of databases, cloud technologies, and Data Vault architecture.
Posted 2 months ago
2.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
Who we are: R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst the Best in Healthcare, Top 50 Best Workplaces for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion, and Top 10 for Health and Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.
About the role: You will need to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling it on proprietary tools. You should be able to take up new initiatives independently and collaborate with external and internal stakeholders, be a strong team player, and be able to create and define SOPs and TATs for ongoing and upcoming projects.
What will you need: Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record. Should have at least 2 years of live hands-on experience in advanced analytical tools (Power BI, Tableau, SQL). Should have a solid understanding of SSIS (ETL) with strong SQL and PL/SQL: connecting to data sources, importing data, and transforming data for business intelligence. Should have expertise in DAX and visuals in Power BI, and hands-on experience on an end-to-end project. Strong mathematical skills to help collect, measure, organize, and analyze data. Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports. Identify, analyze, and interpret trends or patterns in complex data sets. Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Ability to understand, appreciate, and adapt to new business cultures and ways of working. Demonstrates initiative and works independently with minimal supervision.
Posted 2 months ago
2.0 - 7.0 years
3 - 7 Lacs
Noida
Work from Office
Who we are: R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst the Best in Healthcare, Top 50 Best Workplaces for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion, and Top 10 for Health and Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.
About the role: You will need to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling it on proprietary tools. You should be able to take up new initiatives independently and collaborate with external and internal stakeholders, be a strong team player, and be able to create and define SOPs and TATs for ongoing and upcoming projects.
What will you need: Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record. Should have at least 2 years of live hands-on experience in advanced analytical tools (Power BI, Tableau, SQL). Should have a solid understanding of SSIS (ETL) with strong SQL and PL/SQL: connecting to data sources, importing data, and transforming data for business intelligence. Should have expertise in DAX and visuals in Power BI, and hands-on experience on an end-to-end project. Strong mathematical skills to help collect, measure, organize, and analyze data. Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports. Identify, analyze, and interpret trends or patterns in complex data sets. Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Ability to understand, appreciate, and adapt to new business cultures and ways of working. Demonstrates initiative and works independently with minimal supervision.
Posted 2 months ago
8.0 - 13.0 years
7 - 12 Lacs
Pune
Work from Office
Job Title: Business Functional Analyst
Corporate Title: Associate
Location: Pune, India
Role Description: Business Functional Analysis is responsible for business solution design in complex project environments (e.g. transformational programmes). Work includes: Identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation. Analysing business requirements and the associated impacts of the changes. Designing and assisting businesses in developing optimal target-state business processes. Creating and executing against roadmaps that focus on solution development and implementation. Answering questions of methodological approach with varying levels of complexity. Aligning with other key stakeholder groups (such as Project Management & Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing and maintaining solutions.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.
Your key responsibilities: Write clear and well-structured business requirements/documents. Convert roadmap features into smaller user stories. Analyse process issues and bottlenecks and make improvements. Communicate and validate requirements with relevant stakeholders. Perform data discovery, analysis, and modelling. Assist with project management for selected projects. Understand and translate business needs into data models supporting long-term solutions. Understand existing SQL/Python code and convert it into business requirements. Write advanced SQL and Python scripts.
Your skills and experience: A minimum of 8+ years of experience in business analysis or a related field. Exceptional analytical and conceptual thinking skills. Proficient in SQL. Proficient in Python for data engineering. Experience in automating ETL testing using Python and SQL. Exposure to GCP services for cloud storage, data lake, database, and data warehouse, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, and shell scripting. Previous experience in Procurement and Real Estate would be a plus. Competency in JIRA, Confluence, draw.io, and Microsoft applications including Word, Excel, PowerPoint, and Outlook. Previous banking domain experience is a plus. Good problem-solving skills.
How we'll support you.
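As an illustration of the "automating ETL testing using Python and SQL" requirement above, here is a minimal, hypothetical pytest sketch that reconciles a source table against a target table. The in-memory SQLite database and the table names are stand-ins for real source/target connections.

```python
# Hypothetical ETL reconciliation tests: row counts and totals must match after a load.
import sqlite3

import pytest


@pytest.fixture()
def conn():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
    db.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    db.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    yield db
    db.close()


def test_row_counts_match(conn):
    # Row-count reconciliation: every source row should have landed in the target.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_amount_totals_match(conn):
    # Sum reconciliation catches truncated or duplicated loads that counts alone miss.
    src = conn.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
    assert src == pytest.approx(tgt)
```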
Posted 2 months ago
7.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Senior Engineer
Location: Pune, India
Corporate Title: AVP
Role Description: Investment Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability and safety, and gives on-demand access to services needed to build, host and manage applications on the cloud/on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls/security and reducing application team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission-critical processing systems.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.
Your key responsibilities: As a CARE platform engineer you will be working across the board on activities to build/support the platform and liaising with tenants. To be successful in this role, the key responsibility areas are: Responsible for managing and monitoring cloud computing systems and providing technical support to ensure the systems' efficiency and security. Work with platform leads and platform engineers at a technical level. Liaise with tenants regarding onboarding and provide platform expertise. Contribute to the platform offering as part of sprint deliverables. Support the production platform as part of the wider team.
Your skills and experience: Understanding of GCP and services such as GKE, IAM, identity services and Cloud SQL. Kubernetes/service mesh configuration. Experience in infrastructure-as-code tooling such as Terraform. Proficient in SDLC/DevOps best practices. GitHub experience, including Git workflow. Exposure to modern deployment tooling, such as ArgoCD, desirable. Programming experience (such as Java/Python) desirable. A strong team player comfortable in a cross-cultural and diverse operating environment. Result-oriented with the ability to deliver under tight timelines. Ability to successfully resolve conflicts in a globally matrix-driven organization. Excellent communication and collaboration skills. Must be comfortable with navigating ambiguity to extract meaningful risk insights.
How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people.
Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
7.0 - 12.0 years
16 - 20 Lacs
Pune
Work from Office
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AS
Location: Pune, India
Role Description: The engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the important engineering principles of the bank. Root cause analysis skills are developed through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews and automating the delivery lifecycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory common commitments to mandate monitors.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.
Your key responsibilities: Analyzing data sets and designing and coding stable and scalable data ingestion workflows, also integrating them into existing workflows. Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution. Hands-on experience with various data sourcing in Hadoop as well as GCP. Ensuring new code is tested at both unit and system level; design, develop and peer review new code and functionality. Operate as a team member of an agile scrum team. Root cause analysis skills to identify bugs and issues for failures. Support production support and release management teams in their tasks.
Your skills and experience: More than 7 years of coding experience in reputed organizations. Hands-on experience in Bitbucket and CI/CD pipelines. Proficient in Hadoop, Python, Spark, SQL, Unix and Hive. Basic understanding of on-prem and GCP data security. Hands-on development experience on large ETL/big data systems; GCP is a big plus. Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc. Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc. Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc. Hands-on business and systems knowledge gained in a regulatory delivery environment. Banking experience with regulatory and cross-product knowledge. Passionate about test-driven development.
How we'll support you
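To illustrate the kind of data ingestion workflow described above, here is a minimal PySpark sketch. The paths and column names are hypothetical, and on GCP a job like this would typically run on Dataproc.

```python
# Minimal batch ingestion sketch in PySpark: read raw CSV, clean it, write curated Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_transactions").getOrCreate()

# Read raw CSV, keep valid rows, and stamp a load date for lineage/partitioning.
raw = spark.read.option("header", "true").csv("data/raw/transactions.csv")  # hypothetical path
clean = (
    raw.filter(F.col("account_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
)

# Write partitioned Parquet so downstream Hive- or BigQuery-facing jobs can consume it.
clean.write.mode("overwrite").partitionBy("load_date").parquet("data/curated/transactions")

spark.stop()
```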
Posted 2 months ago
15.0 - 20.0 years
32 - 40 Lacs
Pune
Work from Office
Job Title: Senior Engineer, VP
Location: Pune, India
Role Description: The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals. Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle. Ensuring maintainability and reusability of engineering solutions. Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow. Reviewing engineering plans and quality to drive re-use and improve engineering capability. Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.
Your key responsibilities: Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities. Champion engineering best practices and guide/mentor the team to achieve high performance. Work closely with business stakeholders, the tribe lead, the product owner and the lead architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, think innovatively, show a growth mindset and collaborate for success.
Your skills and experience: Minimum 15 years of IT industry experience in full stack development. Expert in Java, Spring Boot, NodeJS, ReactJS. Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc. Strong experience in Kubernetes and the OpenShift container platform. Experience in data streaming, i.e. Kafka, Pub/Sub, etc. Experience working on public cloud: GCP preferred, AWS or Azure. Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc. Experience leading teams and mentoring developers.
Key skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS.
Advantageous: Prior experience in the Banking/Finance domain. Experience with hybrid cloud solutions, preferably using GCP. Experience in product development.
How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
0.0 - 3.0 years
6 - 8 Lacs
Noida
Work from Office
Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
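By way of illustration, the SQL features named here (CTEs and window functions) look like the following. This is a self-contained sketch using Python's built-in sqlite3 module with made-up data; it assumes an SQLite build of 3.25+ for window-function support.

```python
# Sketch: a CTE plus a window function, run against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", "2024-01", 100.0), ("east", "2024-02", 120.0),
     ("west", "2024-01", 90.0), ("west", "2024-02", 80.0)],
)

query = """
WITH monthly AS (                        -- CTE: aggregate first
    SELECT region, month, SUM(revenue) AS total
    FROM sales
    GROUP BY region, month
)
SELECT region, month, total,
       SUM(total) OVER (                 -- window function: running total per region
           PARTITION BY region ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY region, month;
"""

for row in conn.execute(query):
    print(row)
conn.close()
```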
Posted 2 months ago
0.0 - 1.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 months ago
4.0 - 8.0 years
22 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
Posted 2 months ago
7.0 - 12.0 years
25 - 27 Lacs
Hyderabad
Work from Office
Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
Posted 2 months ago
6.0 - 11.0 years
12 - 22 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers for the Chennai location. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers. Please find below the company profile and job description. If interested, please share your updated resume, recent professional photograph and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 6+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai
Skills required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, BigTable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 6+ years of professional experience: data engineering, data product development and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards,
Jogeshwari
Senior Specialist
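For context, a workflow orchestration tool like Airflow (listed above) expresses a pipeline as a DAG of tasks. Below is a minimal, hypothetical example: the DAG id and task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal Airflow DAG sketch: a daily extract -> load pipeline with two Python tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull one day's worth of records from the source system.
    print("extracting for", context["ds"])


def load(**context):
    # Placeholder: load the extracted batch into BigQuery or another warehouse.
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_events_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # load runs only after extract succeeds
```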
Posted 2 months ago
1.0 - 2.0 years
3 - 6 Lacs
Dhule
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 months ago
10.0 - 15.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Work Requirements: Minimum Bachelor's degree required with 6-8 years of experience in Oracle EBS R12.
1. 6+ years of experience with Oracle applications E-Business Suite (11i or R12) as a techno-functional consultant.
2. 6+ years of experience with SQL and PL/SQL and SQL tuning, including SQL and PL/SQL development tools.
3. 5+ years of experience with Oracle Forms and Oracle Reports.
4. 5+ years of experience with XML/BI Publisher.
5. 2+ years of experience with Oracle Workflow Builder.
6. 2+ years of experience with Unix shell scripting.
Specific Work Preferences:
1. Knowledge of general business operating principles.
2. Advanced troubleshooting skills.
3. Ability to multitask and maintain composure when working with the business users.
4. Good techno-functional knowledge of the Order to Cash (O2C) and Procure to Pay (P2P) processes.
5. Technical expertise with a solid understanding of underlying data flow and functionality in Oracle modules like Inventory, Shipping Execution, Order Management, Purchasing, iProcurement, WIP and BOM.
6. Good technical and functional knowledge of Supply Chain modules.
7. Expertise in Forms personalization and customization.
8. Knowledge of Oracle Application Framework (OAF) and ADF a plus.
9. Knowledge of Oracle Mobile Forms development is a plus.
10. Knowledge of Application Object Library.
11. Excellent analytical and problem-solving skills.
12. Excellent verbal and written communication skills.
Posted 2 months ago
1.0 - 2.0 years
3 - 5 Lacs
Ahmedabad
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 months ago
13.0 - 17.0 years
32 - 35 Lacs
Noida, Gurugram
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 months ago
3.0 - 5.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 months ago
2.0 - 7.0 years
1 - 6 Lacs
Hyderabad, Qatar
Work from Office
SUMMARY
Job Summary: Exciting job opportunity as a Registered Nurse in Qatar (Homecare).
Key Responsibilities: Develop and assess nursing care plans. Monitor vital signs and assess holistic patient needs. Collaborate with physicians, staff nurses, and healthcare team members. Administer oral and subcutaneous medications while ensuring safety. Document nursing care, medications, and procedures using the company's Nurses Buddy application. Conduct client assessment and reassessment using approved tools. Attend refresher training courses, seminars, and training.
Timeline for Migration: Application to selection: not more than 5 days. Data flow & Prometric: 1 month. Visa processing: 1-2 months. Start working in Qatar within 3 months!
Requirements: Educational qualification: Bachelor's degree in Nursing or GNM. Experience: minimum 2 years of working experience as a nurse post registration. Certification: registration certificate from the Nursing Council. Language: basic English proficiency required. Technical skills: bedside nursing, patient care, patient assessment and monitoring.
Benefits: High salary & perks: earn 5000 QAR/month (1,18,000 INR/month). Tax benefit: no tax deduction on salary. Career growth: advance your nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialization. Relocation support: visa process and flight sponsored; free accommodation and transportation provided. International work experience: boost your resume with international healthcare expertise. Comprehensive health insurance: medical coverage under Qatar's healthcare system. Safe and stable environment: Qatar is known for its low crime rate, political stability, and high quality of life; the strict laws in the country make it one of the safest places to live. Faster visa processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times. Simplified licensing process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through QCHP (Qatar Council for Healthcare Practitioners). Direct hiring opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications.
Limited slots available! Apply now to secure your place in the next batch of Nurses migrating to Qatar!
Posted 2 months ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
GCP Engineer: The GCP developer should have expertise in components like Scheduler, Dataflow, BigQuery, Pub/Sub, Cloud SQL, etc. Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects. Knowledge of Java/Java frameworks; has leveraged/worked with any or all technology areas like Spring Boot, Spring Batch, Spring Boot Cloud, etc. Experience with API and microservice design principles, having leveraged them in actual project implementations for integration. Deep understanding of architecture and design patterns. Needs to have knowledge of implementing event-driven architecture, data integration, event streaming architecture, and API-driven architecture. Needs to be well versed in DevOps principles and have working experience in Docker/containerization. Experience in solution and execution of IaaS, PaaS, and SaaS-based deployments, etc. Requires conceptual thinking to create out-of-the-box solutions. Should be good in communication and able to handle both the customer and the development team to deliver an outcome.
Mandatory Skills: App-Cloud-Google.
Experience: 5-8 years.
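As a small illustration of the event-driven integration this role mentions, here is a hypothetical sketch of publishing an event to Google Cloud Pub/Sub from Python (the posting's primary language is Java; the project and topic names are made up, and credentials are assumed to come from the environment, e.g. GOOGLE_APPLICATION_CREDENTIALS).

```python
# Sketch: publish a JSON event to a Pub/Sub topic with the google-cloud-pubsub client.
import json

from google.cloud import pubsub_v1

PROJECT_ID = "my-demo-project"   # hypothetical
TOPIC_ID = "order-events"        # hypothetical

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

event = {"order_id": "1234", "status": "CREATED"}
# Pub/Sub message bodies are bytes; extra keyword arguments become string attributes
# that subscribers can use for lightweight filtering and routing.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    event_type="order.created",
)
print("published message id:", future.result())
```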
Posted 2 months ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Apache NiFi
- 5+ years of hands-on experience with Apache NiFi, including developing, managing, and optimizing complex data flows in production environments.
- Proven experience with Cloudera NiFi (CDP Data Flow) in enterprise environments, including integration with Cloudera Manager.
- Experience migrating NiFi flows across major version upgrades, with a strong understanding of backward compatibility.
- Strong proficiency in Groovy scripting, used for ExecuteScript and InvokeScriptedProcessor processors.
- Solid understanding of SSH and SFTP protocols, including authentication schemes (key-based, password), session negotiation, and file permissions handling in NiFi processors (e.g., ListSFTP, FetchSFTP, PutSFTP).
- Good grasp of data encryption mechanisms, key management, and secure flowfile handling using processors like EncryptContent.
- Experience integrating NiFi with MongoDB, including reading/writing documents via processors like GetMongo, PutMongo, and QueryMongo.
- Experience working with Apache Kafka, including producing and consuming from Kafka topics using NiFi (PublishKafka, ConsumeKafka), and handling schema evolution with Confluent Schema Registry.
- Strong knowledge of Red Hat Enterprise Linux (RHEL) environments, including systemd services, filesystem permissions, log rotation, and resource tuning for JVM-based applications like NiFi.
NiFi-Specific Technical Requirements:
- In-depth knowledge of NiFi flow design principles, including proper use of queues, back pressure, prioritizers, and connection tuning.
- Mastery of controller services, including SSLContextService, DBCPConnectionPool, and RecordReader/RecordWriter services.
- Experience with record-based processing using Avro, JSON, and CSV schemas and Record processors like ConvertRecord, QueryRecord, and LookupRecord.
- Ability to debug and optimize NiFi flows using Data Provenance, bulletins, and log analysis.
- Familiarity with custom processor development in Java/Groovy (optional but preferred).
- Experience setting up secure NiFi clusters, configuring user authentication (LDAP, OIDC), TLS certificates, and access policies.
- Proficiency in parameter contexts, variable registry, and flow versioning using NiFi Registry.
- Understanding of the Zero-Master clustering model, node coordination, and the site-to-site protocol.
- Experience deploying and monitoring NiFi in high-availability, production-grade environments, including using Prometheus/Grafana or Cloudera Manager for metrics and alerting.
Preferred Qualifications:
- Experience working in regulated or secure environments, with strict data handling and audit requirements.
- Familiarity with DevOps workflows, including version-controlled flow templates (JSON/XML), CI/CD integration for NiFi Registry, and automated deployment strategies.
- Strong written and verbal communication skills, with the ability to document flows and onboard other engineers.
Posted 2 months ago
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery. Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making. Must-have skills: Google BigQuery.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX). Proven track record of delivering data integration and data warehousing solutions. Strong SQL and hands-on experience (No FLEX). Experience with data integration and migration projects. Proficient in the BigQuery SQL language (No FLEX). Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes. Experience in cloud solutions, mainly data platform services; GCP certifications. Experience in shell scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: Expert in Python (No FLEX). Strong hands-on experience and strong knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage skills preferred. Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX). Proficiency with tools to automate AZDO CI/CD pipelines like Control-M, GitHub, JIRA, Confluence. Open mindset and ability to quickly adapt to new technologies. Performance tuning of BigQuery SQL scripts. GCP certification preferred. Experience working in an agile environment.
Professional Attributes: Must have good communication skills. Must have the ability to collaborate with different teams and suggest solutions. Ability to work independently with little supervision or as part of a team. Good analytical and problem-solving skills. Good team-handling skills.
Educational Qualification: 15 years of full-time education.
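To illustrate the BigQuery-plus-Python (Pandas) combination this role centres on, here is a minimal, hypothetical sketch that runs a BigQuery SQL query and loads the result into a pandas DataFrame. The project, dataset, and table names are made up, and `to_dataframe()` assumes the client's pandas/db-dtypes extras are installed.

```python
# Sketch: run a BigQuery aggregation and pull the result into pandas for analysis.
from google.cloud import bigquery

client = bigquery.Client(project="my-demo-project")  # hypothetical project id

query = """
SELECT status, COUNT(*) AS orders, SUM(amount) AS revenue
FROM `my-demo-project.sales.orders`   -- hypothetical dataset.table
GROUP BY status
ORDER BY revenue DESC
"""

df = client.query(query).to_dataframe()  # submits the job and waits for the result
print(df.head())
```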
Posted 2 months ago