5.0 - 8.0 years
4 - 7 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do:
- Strong understanding of data management, data cataloguing, and data governance best practices.
- Enterprise data integration and management experience working with data management, EDW technologies, and data governance solutions.
- The ideal Data Governance & Data Catalog Lead will call on their expertise in master data management (MDM), data governance, and data quality control to effectively oversee the data elements of a complex product catalog.
- Thorough understanding of designing and developing data catalogs and data assets on industry-leading tools (open-source catalog tools, Informatica Cloud Data Catalog, Alation, Collibra, or Atlan) that serve as the inventory of collective data assets, helping data owners, stewards, and business users discover relevant data for analytics and reporting.
- Must have experience with Collibra and data quality, including executing at least 2 large data governance/quality projects from inception to production as the technology expert.
- Must have 5+ years of practical experience configuring data governance resources, including business glossaries, resources, dashboards, policies, and search.
- Management of the enterprise glossary through the review of common business terms and definitions and continuous assessments to ensure data adheres to data governance standards.
- Development and configuration of Collibra/Alation data catalog resources: data lineage, custom resources, custom data lineage, relationships, data domains, data domain groups, and composite data domains.
- Implement critical data elements to govern, with corresponding data quality rules, policies, regulations, roles, users, data source systems, and dashboards/visualizations for multiple data domains.
- Administration and management of the Collibra/Alation data catalogue tool, user groups, and permissions.
- Configuration of data profiling and data lineage.
- Work with data owners, stewards, and various stakeholders to understand Collibra/Alation catalogue requirements and configure them in the tool.

What you'll bring:
- Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence
- 3+ years of relevant professional experience in delivering small/medium-scale technology solutions
- Ability to lead project teams, drive end-to-end activities, meet milestones, and provide mentorship/guidance for the team's growth
- Strong understanding of RDBMS concepts, SQL, data warehousing, and reporting
- Experience with big data concepts, data management, data analytics, and cloud platforms
- Proficiency in programming languages like Python
- Strong analytical and problem-solving skills, including expertise in algorithms and data structures

Additional Skills:
- Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations
- Capability to simplify complex concepts into easily understandable frameworks and presentations
- Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects
- Travel to other offices as required to collaborate with clients and internal project teams

Location: Bengaluru, Gurugram, Noida, Pune
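One of the recurring items above is defining data quality rules for critical data elements. The sketch below is a minimal, tool-agnostic illustration of such a rule in Python; the dataset, column names, and 98% threshold are hypothetical, and a real implementation would typically live inside Collibra/Alation or a dedicated data quality engine rather than pandas.

```python
import pandas as pd

# Minimal sketch of a completeness rule for a critical data element (CDE).
# The dataset, column names, and threshold are hypothetical assumptions.
def check_completeness(df: pd.DataFrame, column: str, threshold: float = 0.98) -> dict:
    """Return a pass/fail result for a completeness rule on one column."""
    total = len(df)
    populated = int(df[column].notna().sum())
    ratio = populated / total if total else 0.0
    return {
        "rule": f"completeness({column})",
        "score": round(ratio, 4),
        "passed": ratio >= threshold,
    }

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2, 3, None],
        "email": ["a@x.com", None, "c@x.com", "d@x.com"],
    })
    print(check_completeness(customers, "customer_id"))
    print(check_completeness(customers, "email"))
```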
Posted 4 days ago
2.0 - 5.0 years
4 - 7 Lacs
Pune
Work from Office
Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's Platform Development team designs, implements, tests, and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products and analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact, and contribute to better health outcomes.

What you'll do:
- As part of our full-stack product engineering team, build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies on the Amazon AWS cloud platform.
- Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized.
- Work with junior developers to implement large features that are on the cutting edge of Big Data.
- Be a technical leader to your team, and help them improve their technical skills.
- Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design.
- Work with product managers and architects to design product architecture and to work on POCs.
- Take immediate responsibility for project deliverables.
- Understand client business issues and design features that meet client needs.
- Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.

What you'll bring:
- 1-3 years of experience in developing software, ideally building SaaS products and services
- Bachelor's degree in CS, IT, or a related discipline
- Strong analytic, problem-solving, and programming ability
- Good hands-on experience with AWS services (EC2, EMR, S3, serverless stack, RDS, SageMaker, IAM, EKS, etc.)
- Experience coding in an object-oriented language such as Python, Java, C#, etc.
- Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies
- Experience with development on the AWS (Amazon Web Services) platform is preferable
- Experience in Linux shell or PowerShell scripting is preferable
- Experience in HTML5, JavaScript, and JavaScript libraries is preferable
- Pharma domain understanding is good to have
- Initiative and drive to contribute
- Excellent organizational and task management skills
- Strong communication skills
- Ability to work in global cross-office teams
- ZS is a global firm; fluency in English is required
Posted 4 days ago
1.0 - 6.0 years
8 - 13 Lacs
Pune
Work from Office
Azure Data Engineer - Pune, India - Enterprise IT - 22756

Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do:
- Create and maintain optimal data pipeline architecture.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability.
- Design, develop, and deploy high-volume ETL pipelines to manage complex and near-real-time data collection.
- Develop and optimize SQL queries and stored procedures to meet business requirements.
- Design, implement, and maintain REST APIs for data interaction between systems.
- Ensure performance, security, and availability of databases.
- Handle common database procedures such as upgrade, backup, recovery, migration, etc.
- Collaborate with other team members and stakeholders.
- Prepare documentation and specifications.

What you'll bring:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 1+ years of experience with SQL, T-SQL, and Azure Data Factory, Synapse, or relevant ETL technology
- Strong analytical skills (impact/risk analysis, root cause analysis, etc.)
- Proven ability to work in a team environment, creating partnerships across multiple levels
- Demonstrated drive for results, with appropriate attention to detail and commitment
- Hands-on experience with Azure SQL Database

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
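The ETL and stored-procedure responsibilities above can be illustrated with a minimal sketch; pyodbc is one common way to call Azure SQL from Python, and the server, database, credentials, and procedure name below are hypothetical assumptions rather than details from the posting.

```python
import pyodbc

# Hedged sketch: executing a parameterized T-SQL stored procedure on Azure SQL.
# All names below (server, database, user, dbo.usp_LoadDailySales) are illustrative.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=sales_dw;"
    "UID=etl_user;PWD=<secret>;"
    "Encrypt=yes;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
# Parameterized execution keeps the call safe from SQL injection.
cursor.execute("EXEC dbo.usp_LoadDailySales @run_date = ?", "2024-01-31")
conn.commit()
cursor.close()
conn.close()
```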
To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At
Posted 4 days ago
0.0 - 3.0 years
3 - 5 Lacs
Pune, Gurugram
Work from Office
What you'll do:
- Collaborate with ZS internal teams and client teams to shape and implement high-quality technology solutions that address critical business problems
- Understand and analyze business problems thoroughly, and translate them into technical designs effectively
- Design and implement technical features using best practices for the specific technology stack being used
- Assist in the development phase of implementing technology solutions for client engagements, ensuring effective problem-solving
- Apply appropriate development methodologies (e.g., agile, waterfall, system integrated testing, mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of projects
- Provide guidance and support to team members in creating comprehensive project implementation plans
- Work closely with a development team to accurately interpret and implement business requirements

What you'll bring:
- Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence
- Proficiency in RDBMS concepts, SQL, and programming languages such as Python
- Strong analytical and problem-solving skills to convert intricate business requirements into technology solutions
- Knowledge of algorithms and data structures

Additional Skills:
- 0-3+ years of relevant professional experience in delivering small/medium-scale technology solutions
- Strong verbal and written communication skills to effectively convey results and issues to internal and client teams
- Familiarity with big data concepts and cloud platforms like AWS, Azure, and Google Cloud Platform
Posted 4 days ago
2.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
DevOps Engineer - Qilin Lab, Bangalore, India

Role: We are seeking an experienced DevOps Engineer to deliver insights from massive-scale data in real time. Specifically, we're searching for someone who has fresh ideas and a unique viewpoint, and who enjoys collaborating with a cross-functional team to develop real-world solutions and positive user experiences for every user.

Responsibilities of this role:
- Work with DevOps to run the production environment by monitoring availability and taking a holistic view of system health
- Build software and systems to manage our Data Platform infrastructure
- Improve reliability, quality, and time-to-market of our Global Data Platform
- Measure and optimize system performance and innovate for continual improvement
- Provide operational support and engineering for a distributed platform at scale
- Define, publish, and defend service-level objectives (SLOs)
- Partner with data engineers to improve services through rigorous testing and release procedures
- Participate in system design, platform management, and capacity planning
- Create sustainable systems and services through automation and automated run-books
- Take a proactive approach to identifying problems and seeking areas for improvement
- Mentor the team in infrastructure best practices

Requirements:
- Bachelor's degree in Computer Science or an IT-related field, or equivalent practical experience with a proven track record
- Hands-on working knowledge and experience with Kubernetes, EC2, RDS, the ELK Stack, and cloud platforms (AWS, Azure, GCP), preferably AWS, including building and operating clusters
- Related technologies such as containers, Helm, Kustomize, and Argo CD
- Ability to program (structured and OOP) using at least one high-level language such as Python, Java, Go, etc.
- Agile methodologies (Scrum, TDD, BDD, etc.)
- Continuous Integration and Continuous Delivery tooling (GitOps), Terraform, Unix/Linux environments
- Experience with several of the following tools/technologies is desirable: Big Data platforms (e.g., Apache Hadoop and Apache Spark), streaming technologies (Kafka, Kinesis, etc.), ElasticSearch Service, mesh and orchestration technologies (e.g., Argo)
- Knowledge of the following is a plus: security (OWASP, SIEM, etc.), infrastructure testing (chaos, load, security), GitHub, microservices architectures

Notice period: Immediate to 15 days
Experience: 3 to 5 years
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: On site
Job Type: Payroll

Must Have Skills:
- Python - 3 years - Intermediate
- DevOps - 3 years - Intermediate
- AWS - 2 years - Intermediate
- Agile Methodology - 3 years - Intermediate
- Kubernetes - 3 years - Intermediate
- ElasticSearch - 3 years - Intermediate

(ref: hirist tech)
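The "define, publish, and defend service-level objectives" responsibility lends itself to a small worked example; the 99.9% availability target and the request counts below are hypothetical, not figures from the posting.

```python
# Minimal sketch of computing an SLO error budget from request counts.
SLO_TARGET = 0.999  # hypothetical availability objective (99.9%)

def error_budget_remaining(total_requests: int, failed_requests: int) -> float:
    """Return the fraction of the error budget still unspent for the window."""
    allowed_failures = total_requests * (1 - SLO_TARGET)
    if allowed_failures == 0:
        return 0.0
    return max(0.0, 1 - failed_requests / allowed_failures)

if __name__ == "__main__":
    # Example window: 2,000,000 requests, 1,200 failures -> 2,000 failures allowed.
    remaining = error_budget_remaining(2_000_000, 1_200)
    print(f"Error budget remaining: {remaining:.1%}")  # 40.0%
```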
Posted 4 days ago
10.0 - 14.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Greetings from Infogain! We have an immediate requirement for a Big Data Engineer (Lead) position at Infogain India Pvt Ltd. As a Big Data Engineer (Lead), you will be responsible for leading a team of big data engineers. You will work closely with clients and team members to understand their requirements and develop architectures that meet their needs. You will also be responsible for providing technical leadership and guidance to your team.

Mode of hiring: Permanent
Skills: (Azure OR AWS) AND Apache Spark OR Hive OR Hadoop AND Spark Streaming OR Apache Flink OR Kafka AND NoSQL AND Shell OR Python
Experience: 10 to 14 years
Location: Bangalore/Noida/Gurgaon/Pune/Mumbai/Kochi
Notice period: Early joiner
Educational Qualification: BE/BTech/MCA/M.Tech

Working Experience: 12-15 years of broad experience working with Enterprise IT applications in cloud platform and big data environments.

Competencies & Personal Traits:
- Work as a team player
- Excellent problem analysis skills
- Experience with at least one cloud infrastructure provider (Azure/AWS)
- Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL)
- Experience in building streaming data pipelines using Apache Spark Structured Streaming or Apache Flink on Kafka and Delta Lake
- Knowledge of NoSQL databases; good to have experience in Cosmos DB, RESTful APIs, and GraphQL
- Knowledge of big data ETL processing tools, data modelling, and data mapping
- Experience with Hive and Hadoop file formats (Avro / Parquet / ORC)
- Basic knowledge of scripting (shell / bash)
- Experience working with multiple data sources, including relational databases (SQL Server / Oracle / DB2 / Netezza), NoSQL / document databases, and flat files
- Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo, and Azure DevOps
- Basic understanding of DevOps practices using Git version control
- Ability to debug, fine-tune, and optimize large-scale data processing jobs

You can share your CV at arti.sharma@infogain.com with the following details:
- Total experience:
- Relevant experience in big data:
- Relevant experience in AWS or Azure cloud:
- Current CTC:
- Expected CTC:
- Current location:
- OK for Bangalore location:
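For the streaming requirement above (Spark Structured Streaming on Kafka and Delta Lake), a minimal sketch might look like the following; the broker address, topic, schema, and storage paths are hypothetical, and the cluster is assumed to have the Kafka and Delta Lake connector packages available.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Hedged sketch: Kafka -> Spark Structured Streaming -> Delta Lake.
spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical message schema for the "orders" topic.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # illustrative broker
       .option("subscribe", "orders")                      # illustrative topic
       .load())

orders = (raw.select(from_json(col("value").cast("string"), schema).alias("o"))
             .select("o.*"))

query = (orders.writeStream
         .format("delta")
         .option("checkpointLocation", "/checkpoints/orders")  # illustrative path
         .outputMode("append")
         .start("/delta/orders"))                              # illustrative path

query.awaitTermination()
```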
Posted 4 days ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Description: Retail Business Services (RBS) supports Amazon's Retail business growth worldwide through three core tasks. These are (a) Selection, where RBS sources, creates, and enriches ASINs to drive GMS growth; (b) Defect Elimination, where RBS resolves inbound supply chain defects and develops root-cause fixes to improve free cash flow; and (c) supporting operational processes for worldwide Retail teams where there is an air gap in the tech stack. The tech team in RBS develops automation that leverages Machine/Deep Learning to scale execution of these highly complex tasks that currently require human cognitive skills. Our solutions ensure that information in Amazon's catalog is complete, correct, and comprehensive enough to give Amazon customers a great shopping experience every time. That's where you can help.

We are looking for a sharp, experienced Application Engineer (AE) with a diverse skillset and background. As an AE, you will work directly with our business teams to solve their support needs with the existing applications, and collect requirements and ways to deliver highly scalable solutions in collaboration with other technical teams. You will play an active role in translating business and functional requirements into concrete deliverables and building scalable systems. You will also contribute to keeping the services healthy and robust. You will be responsible for implementing and maintaining the solutions you provide. You will work closely with engineers on maintaining multiple products and services, creating process automation scripts, and monitoring and handling ad-hoc operational asks.

Basic Qualifications:
- 2+ years of software development, or 2+ years of technical support experience
- Experience troubleshooting and debugging technical systems
- Experience in Unix
- Experience scripting in modern programming languages
- Knowledge of Python, PySpark, Big Data, and SQL queries

Preferred Qualifications:
- Knowledge of web services, distributed systems, and web application development
- Experience with REST web services, XML, JSON

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, visit amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI BLR 14 SEZ
Job ID: A3032552
Posted 4 days ago
6.0 - 10.0 years
20 - 27 Lacs
Pune, Chennai
Work from Office
Mandatory: Experience and knowledge in designing, implementing, and managing non-relational data stores (e.g., MongoDB, Cassandra, DynamoDB), focusing on flexible schema design, scalability, and performance optimization for handling large volumes of unstructured or semi-structured data. Mainly, the client needs a NoSQL DB, either MongoDB or HBase.

Responsibilities:
- Data Pipeline Development: Design, develop, test, and deploy robust, high-performance, and scalable ETL/ELT data pipelines using Scala and Apache Spark to ingest, process, and transform large volumes of structured and unstructured data from diverse sources.
- Big Data Expertise: Leverage expertise in the Hadoop ecosystem (HDFS, Hive, etc.) and distributed computing principles to build efficient and fault-tolerant data solutions.
- Advanced SQL: Write complex, optimized SQL queries and stored procedures.
- Performance Optimization: Continuously monitor, analyze, and optimize the performance of data pipelines and data stores. Troubleshoot complex data-related issues, identify bottlenecks, and implement solutions for improved efficiency and reliability.
- Data Quality & Governance: Implement data quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and consistency of data. Contribute to data governance and security best practices.
- Automation & CI/CD: Implement automation for data pipeline deployment, monitoring, and alerting using tools like Apache Airflow, Jenkins, or similar CI/CD platforms.
- Documentation: Create and maintain comprehensive technical documentation for data architectures, pipelines, and processes.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Minimum 5 years of professional experience in Data Engineering, with a strong focus on big data technologies.
- Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark.
- Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals.
- Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics.
- Mandatory: Experience and knowledge in designing, implementing, and managing non-relational data stores (e.g., MongoDB, Cassandra, DynamoDB), focusing on flexible schema design, scalability, and performance optimization for handling large volumes of unstructured or semi-structured data.
- Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive).
- Experience with building and optimizing ETL/ELT processes and data warehousing concepts.
- Strong understanding of data modeling techniques (e.g., Star Schema, Snowflake Schema).
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an Agile team environment.
Posted 4 days ago
6.0 - 11.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Role Overview: The Enterprise Architect has strong expertise in AWS AI technologies and Anthropic systems. The jobholder designs and implements cutting-edge AI solutions that align with organizational goals, ensuring scalability, security, and innovation.

Responsibilities:
- Architect and implement AI solutions using AWS AI services and Anthropic systems.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Design scalable and secure architectures for AI-driven applications.
- Optimize AI workflows for performance and reliability.
- Provide technical leadership and mentorship to development teams.
- Stay updated with emerging trends in AI and cloud technologies.
- Troubleshoot and resolve complex technical issues related to AI systems.
- Document architectural designs and decisions for future reference.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Extensive experience with AWS AI services and Anthropic systems.
- Strong understanding of AI architecture, design, and optimization.
- Proficiency in programming languages such as Python and Java.
- Experience with cloud-based AI solutions is a plus.
- Familiarity with Agile development methodologies.
- Knowledge of data governance and compliance standards.
- Excellent problem-solving and analytical skills.
- Proven leadership and team management abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.
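As context only: Amazon Bedrock is the usual AWS entry point to Anthropic models, and a Python invocation might look like the sketch below. The region, model ID, and prompt are illustrative assumptions rather than details from the posting; the model IDs actually available should be confirmed in the Bedrock console for the account in question.

```python
import json
import boto3

# Hedged sketch: invoking an Anthropic model via Amazon Bedrock from Python.
client = boto3.client("bedrock-runtime", region_name="us-east-1")  # illustrative region

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize our Q3 sales notes."}],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # illustrative model ID
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)
print(json.loads(response["body"].read()))
```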
Posted 4 days ago
4.0 - 7.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role Overview: At Skyhigh Security, we are building ground-breaking technology to help enterprises enable and accelerate the safe adoption of cloud services. SSE products help the world's largest organizations unleash the power of the cloud by providing real-time protection for enterprise data and users across all cloud services.

The Data Analytics team of our cloud service BU is looking for a capable, enthusiastic Big Data Test Engineer: a creative, innovative, and results-oriented person willing to go the extra mile in a fast-paced environment, who will take ownership of major big data components/services and all backend aspects of the software life cycle in a SaaS environment. The Data Analytics team manages big data pipelines and machine learning systems pertaining to our Skyhigh Security Cloud. We are responsible for analysing more than 40 terabytes of data per day, and we inspect close to a billion user activities in real time for threat protection and monitoring.

As a member of our engineering team, you'll provide technical expertise (architecture, design, development, code reviews, use of modern static analysis tools, unit testing & system integration, automated testing, etc.). The role requires frequent use of ingenuity, creativity, and thinking outside the box in order to effectively contribute to our outstanding analytics solution and capabilities.

We firmly believe in our values, and that is what makes us tick as one of the successful teams within Skyhigh Security. The more these values resonate with you, the better the chance of you thriving within our environment:
- You find clarity and make right decisions despite ambiguity
- You are curious in general and fascinated by how things work in this world
- You listen well before you respond to others
- You want to make an impact on the team and the company
- You are not afraid to speak your mind and are willing to put the team ahead of yourself
- You are humble, and genuinely want to help your team members
- You can remain calm even in the most stressful situations
- You aim for simplicity in whatever you do

The successful candidate possesses the excellent interpersonal and communication skills required to partner with other teams across the business to identify opportunities and risks and to develop and deliver solutions that support business strategies. This individual will report to the Senior Engineering Manager within the Cloud Business Unit and will be based in Bangalore, India.

About the role:
- End-to-end software development and test: automate, build, maintain, and provide production support for big data pipelines and the Hadoop ecosystem
- Recognize the big picture and take initiative to solve problems and automate
- Be aware of current big data technology trends and factor this into current design and implementation
- Document the test plan and automation strategy and present them to stakeholders
- Identify, recommend, coordinate, and deliver timely knowledge to the globally distributed teams regarding technologies, processes, and tools
- Proactively identify and communicate roadblocks

About You - Minimum Requirements:
- Bachelor's degree in Computer Science or an equivalent degree; a Master's degree is a plus
- Overall 4 to 7 years of experience; able to contribute individually as needed and coordinate with other teams
- Good exposure to test frameworks like JUnit, TestNG, Cucumber, and mocking frameworks
- Experience developing applications with Java and Spring
- Test, develop, and implement automated tests using Python
- Experience with any automation framework
- Hands-on experience with Robot Framework is a plus
- Big data experience is a plus
- Exposure to Agile development, TDD, and Lean development
- Experience with AWS CloudFormation, CloudWatch, SQS, and Lambda is a plus
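The Python test automation this role calls for can be sketched with a small pytest example against a Spark transformation; the function under test, its schema, and the sample data are hypothetical stand-ins, not part of the posting.

```python
import pytest
from pyspark.sql import SparkSession

# Hedged sketch: an automated test for a big data transformation.
@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def dedupe_events(df):
    """Hypothetical transformation under test: keep one row per event_id."""
    return df.dropDuplicates(["event_id"])

def test_dedupe_removes_duplicate_event_ids(spark):
    df = spark.createDataFrame(
        [("e1", "login"), ("e1", "login"), ("e2", "upload")],
        ["event_id", "action"],
    )
    result = dedupe_events(df)
    assert result.count() == 2
    assert sorted(r.event_id for r in result.collect()) == ["e1", "e2"]
```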
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
As a D365 Finance & Operations Functional/Technical Consultant at Databuzz, you will be responsible for leveraging your expertise in MS Dynamics D365 F&O to enhance business operations and drive efficiency. With 7-9 years of experience in the field, you will play a key role in supporting various modules such as Production (Manufacturing), Trade & Logistics, and Project Management. Your proficiency in D365 F&O will be crucial as you contribute to the success of minimum 2-3 full lifecycle implementations. Your responsibilities will include providing functional support for D365 F&O application maintenance and operations, as well as utilizing your knowledge in areas such as Purchase, Sales, Inventory, Advance Warehouse, Master Planning, and Production. Experience in these domains will be highly advantageous in delivering impactful solutions and driving continuous improvement. Databuzz is a leading provider of data analytics services, specializing in Data Science, Big Data, Data Engineering, AI & ML, Cloud Infrastructure, and DevOps. As an MNC with operations in the UK and India, we adhere to the highest standards of data security and compliance, being an ISO 27001 & GDPR compliant company. If you are passionate about leveraging your skills in D365 F&O to make a meaningful impact, we encourage you to share your profile with us. Kindly include your Current CTC, Expected CTC, Notice Period/Last Working Day, and Date of Birth in your email to joy.praneeth@databuzzltd.com. Candidates currently serving notice period will be given preference. Join us at Databuzz and be part of a dynamic team that is dedicated to driving innovation and delivering excellence in the field of data analytics. We look forward to welcoming talented professionals like you who can contribute to our success. Warm regards, Joypraneeth Talent Acquisition Lead Databuzz Ltd,
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
ahmedabad, gujarat
On-site
You will be joining our engineering team in Ahmedabad as a software engineer. Your main responsibility will involve designing and developing enterprise software for our Global Fortune 500 clients in the Data Analytics, Security, and Cloud segments. Your expertise in core and advanced Python, with experience in developing REST APIs using any framework, will be crucial for this role.

Your responsibilities will include defining, integrating, and upgrading a comprehensive architecture to support Java applications to achieve organization goals. You will provide expertise in the software development life cycle, lead and mentor a small-sized team, ensure code reviews and development best practices are followed, and actively engage in regular client communication. Estimating efforts, identifying risks, providing technical support, and effective people and task management will be key aspects of your role. You must also demonstrate the ability to multitask, re-prioritize responsibilities based on dynamic requirements, and work with minimal supervision.

To be successful in this role, you should have at least 2 years of experience in software architecture, system design, and development, along with extensive software development experience in Python. Experience in developing RESTful web services using any framework, strong Computer Science fundamentals in object-oriented design and data structures, and familiarity with Linux programming are essential. Expertise in Big Data, Networking, Storage, or Virtualization is a plus. Working knowledge of Agile software development methodology, excellent oral and written communication skills, problem-solving abilities, and analytical skills are also required. You must hold a minimum qualification of BE in Computer Science or equivalent.
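A minimal sketch of the REST API work this role describes, using Flask as one possible framework; the resource name and in-memory store are illustrative assumptions, not part of the posting.

```python
from flask import Flask, jsonify, request

# Hedged sketch: a tiny REST API with one resource ("devices" is hypothetical).
app = Flask(__name__)
_devices = {}  # in-memory store for illustration only

@app.route("/devices", methods=["POST"])
def create_device():
    payload = request.get_json(force=True)
    device_id = str(len(_devices) + 1)
    _devices[device_id] = {"id": device_id, "name": payload.get("name")}
    return jsonify(_devices[device_id]), 201

@app.route("/devices/<device_id>", methods=["GET"])
def get_device(device_id):
    device = _devices.get(device_id)
    if device is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(device)

if __name__ == "__main__":
    app.run(debug=True)
```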
Posted 4 days ago
8.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Program Lead in Technology Consulting, you will play a crucial role in developing AI and analytics solutions for our clients. Your primary responsibilities will include focusing on data engineering and application engineering aspects of the solutions, collaborating with internal data science teams for the ML aspects, and ensuring the overall success of the program. Your duties will involve driving problem discovery, scoping, program management, and delivery. You will lead requirements elicitation by leveraging your business acumen and technology understanding, translate solution requirements to technical architects, review solution design for compliance with client requirements, and coordinate program delivery by communicating updates, bottlenecks, risks, and delays. Additionally, you will measure and articulate the value of the solutions developed to facilitate solution adoption by business teams. This role will require a blend of hands-on contribution, customer engagement, and team management. The ideal candidate should have experience in program delivery across various data systems (cloud/on-premise, classical RDBMS/big-data/NoSQL), project planning and tracking in agile and waterfall paradigms, and working with different source systems and their impact on design, functioning, and maintenance of data systems. Experience in handling programs with significant application engineering and development components, exposure to MLOps/DevOps in cloud-based predictive analytics solution deployment, team management, and client stakeholder management is preferred. Candidates with 8-14 years of relevant experience, including hands-on development experience, and a techno-functional background (Engineering + MBA) are encouraged to apply. We are committed to providing equal opportunities to all candidates and invite you to join our team as we strive to build the best AI and advanced analytics team in the world. Our compensation packages are competitive and aligned with industry standards based on your expertise and experience. Apart from the challenging work environment, we offer various perks to our employees: 1. **Latest Technology:** You will have the opportunity to work with cutting-edge technologies like machine learning and artificial intelligence. 2. **Global Exposure:** Our clientele includes leading global brands, providing you with exposure to global markets and international clients. 3. **Learning & Development:** We partner with renowned learning platforms to support your continuous learning and growth. 4. **Growth Mindset:** We foster a growth mindset and believe in continuous learning. There is no pressure to master everything, as we collectively explore the vast potential of data to solve complex problems. 5. **Remote Working:** You will have the flexibility to choose your work location, providing a balanced work-life environment. 6. **Additional Benefits:** We offer health insurance coverage for you and your family, access to a virtual wellness platform, and opportunities to engage with fun and knowledge communities within the organization. If you are passionate about technology, data, and innovation, and are looking to make a significant impact in the field of AI and analytics, we welcome you to apply and be part of our dynamic team.,
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an AEP Consultant, you will play a crucial role in leading the implementation of end-to-end customer data solutions. Your expertise with Adobe Experience Platform, data engineering, and customer data strategy will be vital in driving business value through technical proficiency and effective stakeholder engagement. Your responsibilities will include leading discovery sessions to comprehend client data sources, business use cases, and integration needs. You will design and implement technical solutions for data ingestion, transformation, and activation within AEP. Serving as the primary technical expert and solution owner during implementation, you will collaborate with cross-functional teams to deliver scalable data solutions. Your role will involve developing, deploying, and managing real-time and batch data pipelines using Python and cloud-native tools. Configuring AEP components and collaborating with data teams to design data models supporting audience segmentation and personalization will be key responsibilities. Additionally, you will support deployment into production, recommend enhancements to existing AEP frameworks, and ensure solutions are scalable, secure, and compliant with data governance and privacy standards. The required skills and qualifications for this role include: - 5+ years of experience in data engineering, analytics, or related technical roles. - Proven experience implementing Adobe Experience Platform (AEP) in enterprise environments. - Strong programming skills in Python and familiarity with big data technologies. - Hands-on experience with cloud platforms like AWS, Azure, or GCP. - Strong understanding of customer data platforms, audience segmentation, and data privacy. - Ability to translate business requirements into scalable and efficient data solutions. - Excellent problem-solving, communication, and stakeholder management skills. Preferred skills include experience with Real-Time CDP, Adobe Launch, or other Adobe Experience Cloud products, knowledge of tools like Snowflake, Kafka, Databricks, or Spark, and being an Adobe Certified Expert in Adobe Experience Platform (preferred but not mandatory). Expected skills for this role encompass AEP expertise, Python programming, big data and cloud experience, data modeling and analysis, as well as strong communication and leadership abilities.,
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As part of the Infosys delivery team, your primary role will involve ensuring effective design, development, validation, and support activities to ensure our clients' satisfaction with high levels of service in the technology domain. You will be responsible for gathering requirements and specifications to thoroughly understand client needs and translating them into system requirements. Your contribution will be crucial in estimating work requirements accurately to provide essential information on project estimations to Technology Leads and Project Managers. Your involvement will be key in building efficient programs/systems and supporting clients in their digital transformation journey. If you are passionate about leveraging technology to drive innovation and support clients in their digital transformation journey, this role is an excellent fit for you.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As an AWS Architect, you will be responsible for owning the architectural design, development, delivery, and maintenance of AI-based solutions. Your primary role will involve leading a team to build innovative solutions on the AWS platform. To excel in this position, you must have hands-on experience in AWS architectural development and a strong understanding of various AWS services such as Analytics, Big data, Databases, Security, Containerization, Application deployment, and CI/CD. Additionally, you should possess basic knowledge of Data analysis and Machine Learning, including data preprocessing, cleaning, modeling, and deployment experience. Requirements engineering and problem-solving skills are essential for this role, along with an AWS Solutions Architect certification. Good communication and presentation skills in English are necessary to effectively interact with stakeholders. A continuous learning attitude is crucial in keeping up with the latest technologies and trends in the field. Experience in automotive data analysis and working with international clients would be beneficial for this position. If you meet these requirements and are passionate about designing cutting-edge solutions on the AWS platform, we encourage you to apply for this exciting opportunity.,
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
The Data Engineer role is crucial in developing high-quality data products to meet the Bank's regulatory requirements and drive data-informed decision making. As a Data Engineer, you will lead by example within the team, collaborate closely with stakeholders, and address any obstacles that may arise. Your expertise in data architecture standards, data warehousing, data structures, and business intelligence will be instrumental in contributing to the success of the business in an agile environment.

Your responsibilities will include developing and supporting scalable, extensible, and highly available data solutions, ensuring alignment with the broader architectural vision, identifying and mitigating risks in the data supply chain, adhering to and enhancing technical standards, and designing analytical data models.

You will be required to have a First Class Degree in Engineering/Technology along with 4-6 years of experience in implementing data-intensive solutions using agile methodologies. Proficiency in relational databases and SQL for data querying, transformation, and manipulation, as well as experience in modelling data for analytical purposes, will be expected.

In terms of technical skills, you must have hands-on experience in building data pipelines with proficiency in data integration platforms like Ab Initio, Apache Spark, Talend, or Informatica. Experience with big data platforms such as Hadoop, Hive, or Snowflake, understanding of data warehousing concepts, relational and NoSQL database design, and data modeling techniques, and proficiency in programming languages like Python, Java, or Scala are essential. Exposure to DevOps concepts, CI/CD platforms, version control, and automated quality control management will be beneficial.

Additional technical skills that would be valuable include experience with Ab Initio for developing Co>Op graphs, knowledge of public cloud data platforms like S3, Snowflake, Redshift, and BigQuery, understanding of data quality controls, familiarity with containerization platforms like Docker and Kubernetes, exposure to various file formats such as Avro, Parquet, and Protobuf, and basics of job schedulers and entitlement management. Certification in any of the mentioned topics would be advantageous for this role.

The position falls under the Technology job family group, specifically in Digital Software Engineering, and is a full-time role. For further details on complementary skills, you can refer to the above requirements or reach out to the recruiter for more information.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Cloud Platform Engineer at Healthcare Intelligence within Providence, you will be responsible for the Azure Administration to ensure the availability and operational efficiency of the cloud platform. Your role will involve managing critical applications hosted on Azure Infrastructure, including AzureSQL, ADF, AKS, Azure VM, and more. Your primary focus will be on maintaining platform availability, reliability, and performance of the Azure Cloud Infrastructure. In the position of Sr. Cloud Engineer, you will play a crucial role in Azure Infrastructure administration, ensuring a highly available and stable environment with sustained performance. Your responsibilities will include working on various Azure services, implementing automation using IaaC approach, troubleshooting production issues, managing Azure resource utilization, and utilizing Telemetry solutions for monitoring and alerting. Your day-to-day activities will involve monitoring and addressing incidents and user requests related to Azure Infrastructure, collaborating with product teams on application architecture and performance issues, working with Enterprise Infrastructure and Security teams on policy implementation, and engaging with Microsoft support on severity issues. To be successful in this role, you should have a Bachelor's degree in Engineering, a minimum of 5 years of experience in Cloud Infrastructure administration with at least 3 years in Azure administration, strong knowledge of Azure Administration concepts, experience with Infrastructure as Code deployment, Azure DevOps, CI/CD, system reliability, Azure Databricks, Azure AI Services, and more. Additionally, you should be proficient in incident management, source code control systems, agile methodologies, and have excellent communication and collaborative skills. Join our team of professionals who are dedicated to improving patient and caregiver experience through innovative technologies and drive a lasting social impact. If you are a pioneering and compassionate individual who is ready to plan for the future of healthcare, we look forward to working with you in re-imagining the future of care with cutting-edge technologies.,
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
Genpact is a global professional services and solutions firm with a team of over 125,000 professionals in more than 30 countries. Driven by curiosity, agility, and the desire to create lasting value for clients, we serve leading enterprises worldwide, including the Fortune Global 500. Our purpose is the relentless pursuit of a world that works better for people, and we achieve this through our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant, Research Data Scientist. We are looking for candidates with relevant experience in Text Mining/Natural Language Processing (NLP) tools, data sciences, Big Data, and algorithms. The ideal candidate should have full-cycle experience in at least one large-scale Text Mining/NLP project, including creating a business use case, Text Analytics assessment/roadmap, technology and analytic solutioning, implementation, and change management. Experience in Hadoop, including development in the map-reduce framework, is also desirable. The Text Mining Scientist (TMS) will play a crucial role in bridging enterprise database teams and business/functional resources, translating business needs into techno-analytic problems, and working with database teams to deliver large-scale text analytic solutions.

Responsibilities:
- Develop transformative AI/ML solutions to address clients' business requirements
- Manage project delivery involving data pre-processing, model training and evaluation, and parameter tuning
- Manage stakeholder/customer expectations and project documentation
- Research cutting-edge developments in AI/ML with NLP/NLU applications in various industries
- Design and develop solution algorithms within tight timelines
- Interact with clients to collect and synthesize requirements for an effective analytics/text mining roadmap
- Work with digital development teams to integrate algorithms into production applications
- Conduct applied research on text analytics and machine learning projects, file patents, and publish papers

Qualifications:

Minimum Qualifications/Skills:
- MS in Computer Science, Information Systems, or Computer Engineering
- Relevant experience in Text Mining/Natural Language Processing (NLP) tools, data sciences, Big Data, and algorithms

Technology:
- Open-source text mining paradigms (NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene) and cloud-based NLU tools (Dialogflow, MS LUIS)
- Statistical toolkits (R, Weka, S-Plus, Matlab, SAS Text Miner)
- Strong Core Java experience, programming in the Hadoop ecosystem, and distributed computing concepts
- Proficiency in Python/R programming; Java programming skills are a plus

Methodology:
- Solutioning and consulting experience in verticals like BFSI and CPG, with text analytics experience on large structured and unstructured data
- Knowledge of AI methodologies (ML, DL, NLP, Neural Networks, Information Retrieval, NLG, NLU)
- Familiarity with Natural Language Processing and statistics concepts, especially in their application
- Ability to conduct client research to enhance the analytics agenda

Preferred Qualifications/Skills:

Technology:
- Expertise in NLP, NLU, and machine learning/deep learning methods
- UI development paradigms for text mining insights visualization
- Experience with Linux, Windows, GPU, Spark, Scala, and deep learning frameworks

Methodology:
- Social network modeling paradigms, tools, and techniques
- Text analytics using NLP tools like Support Vector Machines and social network analysis
- Previous experience with text analytics implementations using open-source packages or SAS Text Miner
- Strong prioritization, consultative mindset, and time management skills

Job Details:
- Job Title: Principal Consultant
- Primary Location: India-Gurugram
- Schedule: Full-time
- Education Level: Master's/Equivalent
- Job Posting Date: Oct 4, 2024, 12:27:03 PM
- Unposting Date: Ongoing
- Master Skills List: Digital
- Job Category: Full Time
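To make the open-source NLP tooling above concrete, here is a minimal NLTK sketch (NLTK is one of the toolkits the posting lists); the sample sentence is illustrative and the downloads are one-time setup steps.

```python
import nltk

# One-time resource downloads for tokenization, POS tagging, and NE chunking.
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")
nltk.download("maxent_ne_chunker")
nltk.download("words")

# Illustrative sentence, not taken from any client data.
text = "Genpact signed a text analytics deal with a Fortune 500 bank in Mumbai."
tokens = nltk.word_tokenize(text)   # split into word tokens
tagged = nltk.pos_tag(tokens)       # part-of-speech tags
entities = nltk.ne_chunk(tagged)    # shallow named-entity chunking

print(tagged[:5])
print([subtree.label() for subtree in entities if hasattr(subtree, "label")])
```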
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
Genpact is a global professional services and solutions firm committed to delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are fueled by curiosity, agility, and a drive to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, and we serve leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are looking for a Senior Principal Consultant, Senior Data Architect to join our team!

Responsibilities:
- Manage programs and ensure the integration and implementation of program elements according to the agreed schedule with quality deliverables.
- Lead and guide the development team on ETL architecture.
- Collaborate closely with customer architects, business, and technical stakeholders to build trust and establish credibility.
- Provide insights on customer direction to guide them towards optimal outcomes.
- Address client technical issues, articulate understanding, and offer solutions in AWS Cloud and Big Data domains.
- Build the infrastructure for efficient extraction, transformation, and loading of data from various sources using SQL and AWS "big data" technologies.
- Analyze the existing technology landscape and current application workloads.
- Design and architect solutions with scalability, operational completion, and elasticity in mind.
- Hands-on experience in building Java applications.
- Optimize Spark applications running on Hadoop EMR clusters for performance.
- Develop architecture blueprints, detailed documentation, and bills of materials, including required cloud services.
- Collaborate with various teams to drive business growth and customer success.
- Lead strategic pre-sales engagements with larger, more complex customers.
- Engage and communicate proactively to align internal and external customer expectations.
- Drive key strategic opportunities with top customers in partnership with sales and delivery teams.
- Maintain customer relationships through proactive pre-sales engagements.
- Lead workshops to identify customer needs and challenges.
- Create and present services responses, proposals, and roadmaps to meet customer objectives.
- Lead presales, solutioning, estimations, and POC preparation.
- Mentor team members and build reusable solution frameworks and components.
- Head complex ETL requirements, design, and implementation.
- Ensure client satisfaction with the product by developing architectural requirements.
- Develop project plans, identify resource requirements, and assure code quality.
- Shape and enhance ETL architecture, recommend improvements, and resolve design issues.

Qualifications:

Minimum qualifications:
- Engineering degree or equivalent.
- Relevant work experience.
- Hands-on experience in ETL/BI tools like Talend, SSIS, Ab Initio, Informatica.
- Experience with cloud technologies such as AWS, Databricks, Airflow.

Preferred Skills:
- Excellent written and verbal communication skills.
- Strong analytical and problem-solving skills.
- Experience in consulting roles within a technology company.
- Ability to articulate technical solutions clearly to different stakeholders.
- Team player with the ability to collaborate effectively.
- Willingness to travel occasionally.

If you are looking to join a dynamic team and contribute to innovative solutions in data architecture, this role might be the perfect fit for you.
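The "optimize Spark applications running on Hadoop EMR clusters" responsibility can be illustrated with a hedged sketch of common tuning levers; the configuration values, S3 paths, and join below are assumptions for illustration, not prescriptions from the posting, and real values depend on cluster size and data volume.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

# Hedged sketch of typical Spark-on-EMR performance levers.
spark = (SparkSession.builder
         .appName("etl-optimization-sketch")
         .config("spark.sql.shuffle.partitions", "400")   # sized to data volume, not the default 200
         .config("spark.sql.adaptive.enabled", "true")    # let AQE coalesce/skew-split at runtime
         .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .getOrCreate())

facts = spark.read.parquet("s3://bucket/facts/")   # hypothetical large table
dims = spark.read.parquet("s3://bucket/dims/")     # hypothetical small table

# Broadcasting the small dimension table avoids a full shuffle join.
joined = facts.join(broadcast(dims), "dim_id")
joined.write.mode("overwrite").parquet("s3://bucket/output/")
```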
Join us at Genpact and be part of shaping the future of professional services and solutions! Job Details: - Job Title: Senior Principal Consultant, Senior Data Architect - Primary Location: India-Mumbai - Schedule: Full-time - Education Level: Bachelor's / Graduation / Equivalent - Job Posting Date: Oct 7, 2024, 7:51:53 AM - Master Skills List: Digital - Job Category: Full Time,
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
We are a brand new Automotive Technology Start-up dedicated to developing driver safety systems tailored for Indian conditions. Our focus is on equipping riders and drivers with advanced rider assistance systems and affordable obstacle-avoidance automotive systems. Leveraging cutting-edge technologies such as Computer Vision, AI/ML, Big Data, Sensor Fusion, Embedded Systems, and IoT, we aim to create smart assistance systems that enhance safety on the road.

Our current suite of products includes innovative solutions like the Front Collision Warning System (FCWS), Sleep Driver Alert System, Driver Monitoring System, and Driver Evaluation and Authentication. These products are designed to address critical safety issues and improve the overall driving experience.

The Start-up is strongly supported by a $500Mn Group, a prominent automotive systems manufacturer serving global OEMs. With 29 manufacturing facilities spread across 7 states in India and a workforce exceeding 15,000 employees, the Group brings valuable expertise and resources to our venture.

We are looking for a skilled AI/ML Engineer to join our early-stage start-up and contribute to the development of our rider safety product. As an AI/ML Engineer, you will be responsible for designing, developing, and integrating computer vision algorithms that play a crucial role in capturing and analyzing the surroundings to provide effective alerts for riders.

In this role, you will:
- Develop state-of-the-art CNN-based computer vision object detectors and classifiers for real-time detection of road objects
- Design and implement data ingestion, annotation, and model training pipelines to handle large volumes of video data and images
- Create model visualizations, conduct hyperparameter tuning, and leverage data-driven insights to enhance model performance
- Optimize models for efficient inference times and deploy them on low-power embedded IoT ARM CPUs
- Establish CI/CD tests to evaluate multiple models on the test set effectively

The ideal candidate should possess:
- A BS or MS degree in Computer Science or Engineering from reputable educational institutions
- Experience in building deep-learning object detectors for computer vision applications
- Proficiency in popular CNN architectures such as AlexNet, GoogLeNet, MobileNet, Darknet, YOLO, SSD, and ResNet
- Hands-on experience with libraries and frameworks like Caffe, TensorFlow, Keras, PyTorch, OpenCV, ARM Compute Library, and OpenCL
- Knowledge of transfer learning and training models with limited data
- Strong programming skills in modern C++14 or above and Python, along with a solid understanding of data structures and algorithms
- Familiarity with working on small embedded computers, hardware peripherals, Docker containers, and Linux-flavored operating systems

To excel in this role, you can stand out by having:
- Prior experience in product development within an early-stage start-up environment
- Expertise in deploying scalable ML models for Android/iOS platforms
- Noteworthy contributions to open-source projects or achievements in coding hackathons
- Passion for Raspberry Pi and ARM CPUs
- Keen interest in the field of autonomous driving

Join us in our mission to revolutionize driver safety through innovative technology solutions tailored for Indian roads.
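The transfer-learning requirement above can be sketched with PyTorch and torchvision (both named in the posting); the class count, frozen-backbone choice, and dummy training step are illustrative assumptions, not the start-up's actual pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hedged sketch: transfer learning for a road-object classifier.
NUM_CLASSES = 4  # e.g. pedestrian, two-wheeler, car, truck (hypothetical)

# Pretrained backbone (torchvision >= 0.13 weights API).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone and train only a new classification head.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"dummy-batch loss: {loss.item():.4f}")
```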
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. The financial services practice at EY offers integrated advisory services to financial institutions and other capital markets participants. Within EY's Advisory practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.
We're looking for Senior and Manager Big Data experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem:
- Expertise in data engineering, including design and development of big data platforms
- Deep understanding of modern data processing technology stacks such as Spark, HBase, and other Hadoop ecosystem technologies; development in Scala is a plus
- Deep understanding of streaming data architectures and technologies for real-time and low-latency data processing (see the sketch after this posting)
- Experience with agile development methods, including core values, guiding principles, and key agile practices
- Understanding of the theory and application of Continuous Integration/Delivery
- Experience with NoSQL technologies and a passion for software craftsmanship
- Experience in the financial industry is a plus
Nice-to-have skills include familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals; experience with NoSQL data stores such as HBase, Cassandra, and MongoDB, along with HDFS, Hive, and Impala; schedulers such as Airflow and NiFi; and experience with Hadoop clustering and auto-scaling. You will develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis, and define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.
To qualify for the role, you must have a BE/BTech/MCA/MBA degree, a minimum of 3 years of hands-on experience in one or more relevant areas, and a total of 6-10 years of industry experience. Ideally, you'll also have experience in the Banking and Capital Markets domains. Skills and attributes for success include using an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates; strong communication, presentation, and team-building skills; experience producing high-quality reports, papers, and presentations; and experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
What we offer: a team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment; the opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals in the only integrated global transaction business worldwide; and opportunities to work with EY Advisory practices globally with leading businesses across a range of industries. Working at EY offers inspiring and meaningful projects, education and coaching alongside practical experience for personal development, support and feedback from engaging colleagues, opportunities to develop new skills and progress your career, and the freedom and flexibility to handle your role in a way that's right for you.
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
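To make the streaming-data requirement above more concrete, here is a minimal Spark Structured Streaming sketch in Python: a windowed aggregation over an event stream with a watermark for late data. The Kafka broker, topic name, and event schema are hypothetical placeholders, not part of the posting, and the Kafka source assumes the Spark Kafka connector is on the classpath.

```python
# Illustrative PySpark Structured Streaming sketch (broker, topic, and schema are hypothetical).
# The "kafka" source requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("stream-aggregation-demo").getOrCreate()

schema = StructType([
    StructField("trade_id", StringType()),
    StructField("symbol", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a JSON event stream from Kafka
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "trades")
       .load())

# Parse the Kafka value payload into typed columns
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

# 1-minute tumbling-window aggregation, tolerating up to 5 minutes of late data
agg = (events
       .withWatermark("event_time", "5 minutes")
       .groupBy(F.window("event_time", "1 minute"), "symbol")
       .agg(F.sum("amount").alias("notional")))

query = (agg.writeStream
         .outputMode("update")
         .format("console")
         .start())
```

The watermark bounds how long the engine keeps state for late events, which is the usual trade-off between latency and completeness in real-time aggregation.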
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a Solution Architect, you will be responsible for developing effective and optimized IT solutions that cater to customer needs and constraints. Your role includes ensuring the robustness of designs across parameters such as performance, security, usability, and scalability. You should possess a strong spirit of innovation, be a self-starter, and be able to devise solutions for complex problems. Proactively developing reusable frameworks and assets to enhance the competitive advantage of the solution is a key aspect of this role.
You will be expected to maintain a comprehensive understanding of the customer's enterprise systems landscape, its dependencies, organizational goals, and technologies. Analyzing current processes and practices to suggest and drive improvements will be part of your responsibilities. Collaboration with stakeholders is crucial for achieving objectives, and you must engage with them continuously. Your role also involves analyzing industry-standard products and technologies to make suitable recommendations, often through proofs of concept or demonstrations. As a subject matter expert, you will respond to sales leads, bid requests, and new opportunities from customers. Presenting designs, solutions, and applications to customers and influencing them to win bids and opportunities are essential parts of this role. Sharing experiences, knowledge, and best practices within the organization is also expected of you.
Job Requirements:
- Minimum 8 years of IT experience
- Proficiency in designing and architecting large-scale distributed systems, preferably in the Airline, Travel, or Hospitality domain
- Sound knowledge of architecture and design patterns
- Deep understanding of the latest technologies and tools, including big data analytics, cloud, and mobility
- Excellent communication and customer-interfacing skills with exposure to requirement management, sizing, and solutioning
- Ability to understand, interpret, and influence client needs, requirements, expectations, and behavior
- Experience in the full project/product development cycle and in supporting applications in a production environment
Required Skills and Experience:
- Industry: IT/Computers-Software
- Role: Solution Architect
- Key Skills: Solution Designing, Solution Architect, Java, J2EE, Big Data, Airline, Aviation, Travel, Logistics, Architecture and Designing, OOAD
- Education: B.Sc/B.Com/M.Sc/MCA/B.E/B.Tech
If you are looking to join a dynamic team as a Solution Architect with extensive experience in developing innovative IT solutions, this opportunity might be the right fit for you. For further details or to apply, please contact jobs@augustainfotech.com.
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
We are seeking a confident, creative, and curious team player to join us as a Business Analyst in Bangalore, working with one of the largest retail companies in the UK. As part of the team, you will support multiple projects for our client in an Agile environment. Your responsibilities will include working closely with Product Managers to deliver outcomes for the product area, owning source-system discovery and data analysis for accuracy and completeness, collaborating with other technology teams to understand their tech initiatives, and gathering and documenting information on systems. You will also build domain knowledge to understand business requirements, document metadata for new and existing data, help maintain backlogs and user stories, validate data to ensure it meets business requirements (a small illustrative check follows this posting), and support business teams during UAT and post-production phases. Additionally, you will support the Engineering team in production landscape design, document all aspects of data products, and perform data lineage analysis.
The ideal candidate will have at least 8 years of relevant Business Analyst experience, with expertise in Big Data Hadoop applications, SQL queries, and Agile methodology. Proficiency in English at the C1 Advanced level is required. If you are a proactive individual with strong analytical skills, experience in business analysis, and a passion for working in a dynamic team environment, we would like to hear from you.
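As an illustration of the kind of data validation this role calls for, below is a minimal sketch of a source-vs-curated reconciliation query run through Spark SQL. The database, table, and column names are hypothetical and only stand in for whatever systems the client actually uses.

```python
# Illustrative reconciliation check between a source extract and a curated data product.
# Database, table, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-validation-demo").getOrCreate()

# Compare row counts and flag null business keys in the curated table
result = spark.sql("""
    SELECT
        (SELECT COUNT(*) FROM source_db.orders)                             AS source_rows,
        (SELECT COUNT(*) FROM curated_db.orders)                            AS target_rows,
        (SELECT COUNT(*) FROM curated_db.orders WHERE customer_id IS NULL)  AS null_customer_ids
""")
result.show()
```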
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
At Goldman Sachs, as an Engineer, you will not only create things but also make the impossible possible. Your role will involve connecting people and capital with innovative ideas to bring about significant changes in the world. You will solve complex engineering challenges for our clients, working within engineering teams that develop highly scalable software and systems, design low-latency infrastructure solutions, proactively defend against cyber threats, and combine machine learning with financial engineering to turn data into actionable insights. By creating new ventures, revolutionizing the financial sector, and embracing a world of opportunities that move at the pace of the markets, you will play a crucial role in driving our business forward.
Engineering lies at the heart of our business, encompassing our Technology Division and global strategists groups. It is an environment that demands innovative strategic thinking and immediate, practical solutions. If you are eager to explore the boundaries of digital possibilities and make a real impact, this is the place to begin your journey. We are seeking Engineers who are visionaries and solution finders, specializing in risk management, big data, mobile technologies, and more. We value individuals who can think creatively, collaborate effectively, adapt to change, and excel in a fast-paced global setting.
Goldman Sachs is dedicated to leveraging the expertise, resources, and creativity of our people to drive growth for our clients, shareholders, and the communities we serve. Established in 1869, we are a prominent global investment banking, securities, and investment management firm headquartered in New York with a global presence. We strongly believe that embracing diversity and fostering inclusion leads to enhanced performance, and we are committed to promoting both within our organization and beyond by providing numerous opportunities for personal and professional growth. From comprehensive training programs, firmwide networks, and wellness initiatives to financial planning resources and mindfulness programs, we offer a supportive environment that enables our people to thrive. To learn more about our culture, benefits, and talented workforce, visit GS.com/careers.
Goldman Sachs is dedicated to accommodating candidates with special needs or disabilities throughout our recruitment process. For more information on our commitment to providing reasonable accommodations, please visit: https://www.goldmansachs.com/careers/footer/disability-statement.html
The Goldman Sachs Group, Inc. 2023. All rights reserved.
Posted 4 days ago