0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 5+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience managing a data or BI team
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience hiring, developing, and promoting engineering talent
- Experience communicating to senior management and customers verbally and in writing

We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music's business decisions and data-driven software development by collecting and providing behavioral and operational metrics to our internal teams. We maintain a scalable and robust data platform to support Amazon Music's rapid growth, and collaborate closely with data producers and data consumers to accelerate innovation using data.

As a Data Engineering Manager, you will manage a team of talented Data Engineers. Your team collects billions of events a day, manages petabyte-scale datasets on Redshift and S3, and develops data pipelines with Spark, SQL, EMR, and Airflow. You will collaborate with product and technical stakeholders to solve challenging data modeling, data availability, data quality, and data governance problems.

At Amazon Music, engineering managers are the primary drivers of their team's roadmap, priorities, and goals. You will be deeply involved in your team's execution, helping to remove obstacles and accelerate progress. A successful candidate will be customer obsessed, highly analytical and detail oriented, able to work effectively in a data-heavy organization, and adept at leading multiple complex workstreams at once.
Key job responsibilities:
- Hiring, motivating, mentoring, and growing a high-performing engineering team
- Owning and managing big data pipelines, Amazon Music's foundational datasets, and the quality and operational performance of those datasets
- Collaborating with cross-functional teams and customers, including business analysts, marketing, product managers, technical program managers, and software engineers/managers
- Defining and managing your team's roadmap, priorities, and goals in partnership with Product, stakeholders, and leaders
- Ensuring timely execution of team priorities and goals by proactively identifying risks and removing blockers
- Recognizing and recommending process and engineering improvements that reduce failures and improve efficiency
- Clearly communicating business updates, verbally and in writing, to technical and non-technical stakeholders, peers, and leadership
- Effectively influencing other teams' priorities and managing escalations
- Owning and improving the business and operational metrics of your team's software
- Ensuring team compliance with policies (e.g., information security, data handling, service level agreements)
- Identifying ways to leverage GenAI to reduce operational overhead and improve execution velocity
- Introducing ideas to evolve and modernize our data model to address customer pain points and improve query performance

About the team: Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, Amazon Music is innovating at some of the most exciting intersections of music and culture.
We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all the music in shuffle mode, plus top ad-free podcasts, included with their membership; customers can upgrade to Amazon Music Unlimited for unlimited, on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale. Learn more at https://www.amazon.com/music.

Preferred qualifications:
- Experience with AWS tools and technologies (Redshift, S3, EC2)
- Experience in processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, or a Spark- or Hadoop-based big data solution)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
Job Description: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Job Description - Grade Specific: The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable, high-quality data solutions to support the organization's data-driven objectives.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 1 month ago
5 - 8 years
0 Lacs
Pune, Maharashtra, India
Entity: Technology. Job Family Group: IT&S Group.

Job Description: Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation, and improvements. The role will coach, mentor, and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solution design, coding and development, testing, implementation, and operational support. You will work closely with the Product Owner to understand requirements/user stories, plan and estimate the time taken to deliver them, and proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. You will be highly skilled and experienced in tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL, and Athena.

Years of Experience: 13-15.

Essential domain expertise:
- Experience in Big Data technologies: AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful, e.g. Teradata, Netezza
- Challenges involved in Big Data: large table sizes (e.g. depth/width), even distribution of data
- Experience of programming: SQL, Python
- Data modelling experience/awareness: Third Normal Form, Dimensional Modelling
- Data pipelining skills: data blending, etc.
- Visualisation experience: Tableau, PBI, etc.
- Data management experience: e.g. data quality, security, etc.
- Experience of working in a cloud environment: AWS
- Development/delivery methodologies: Agile, SDLC
- Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role. Relocation Assistance: This role is eligible for relocation within country. Remote Type: This position is a hybrid of office/remote working.

Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms (Inactive), Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving.

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status, or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
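The posting above flags "even distribution of data" as a core Big Data challenge. As an illustrative sketch (not part of the posting), the snippet below shows why: rows are assigned to partitions by hashing a key, and a low-cardinality key concentrates work on one partition. The partition count and sample keys here are made-up examples.

```python
# Hash rows across partitions and measure skew; illustrates why key choice
# matters for even data distribution in MPP systems. All values are examples.
from collections import Counter

def partition_counts(keys, num_partitions):
    """Assign each key to a partition by hash and count rows per partition."""
    counts = Counter(hash(k) % num_partitions for k in keys)
    return [counts.get(p, 0) for p in range(num_partitions)]

def skew_ratio(counts):
    """Largest partition over the ideal even share; 1.0 means perfectly even."""
    ideal = sum(counts) / len(counts)
    return max(counts) / ideal if ideal else 0.0

# A low-cardinality key (e.g. "country") skews badly: 90% of rows share a value.
skewed_keys = ["IN"] * 90 + ["US"] * 10
even_keys = [f"user-{i}" for i in range(100)]
```

A distribution key like a unique user ID spreads the same 100 rows far more evenly, which is the property MPP databases such as Teradata and Netezza rely on.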
Posted 1 month ago
5 - 10 years
1 - 6 Lacs
Ahmedabad, Bengaluru, Kolkata
Work from Office
Job description: Hiring for ETL Informatica Developer with experience range 5 years & above. Mandatory Skills: ETL Informatica Developer, Yellowbrick/Netezza, Unix shell scripting. Education: BE/B.Tech/MCA/M.Tech/MSc./MS.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from business needs, define to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Posted 2 months ago
4 - 6 years
5 - 10 Lacs
Bengaluru
Work from Office
At Sogeti, we believe the best is inside every one of us. Whether you are early in your career or at the top of your game, we'll encourage you to fulfill your potential, to be better. Through our shared passion for technology, our entrepreneurial culture, and our focus on continuous learning, we'll provide everything you need to do your best work and become the best you can be.

About The Role: Hands-on experience in Oracle DBA.

About The Role - Grade Specific: Hands-on experience in Oracle DBA.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data, and automation.
Posted 2 months ago
3 - 5 years
4 - 8 Lacs
Bengaluru
Work from Office
Job ID/Reference Code: INFSYS-NAUKRI-210748. Work Experience: 3-5 years. Job Title: Netezza Developer.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Netezza Developer. Preferred Skills: Technology->Data Management - Data Store->Netezza.

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical, and debugging skills

Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. * Location of posting is subject to business requirements.
Posted 2 months ago
4 - 7 years
6 - 9 Lacs
Mumbai
Work from Office
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Job Description - Grade Specific: The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable, high-quality data solutions to support the organization's data-driven objectives.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 2 months ago
7 - 11 years
9 - 13 Lacs
Uttar Pradesh
Work from Office
Location: Hyd/Chennai/Bglore. Job Description: Informatica Dev:
- 7+ years of relevant experience, strong in DW fundamentals, data modelling concepts, and the Informatica suite
- Experience in Informatica PowerCenter, Workflow Manager, and Workflow Designer
- Good knowledge of data modelling concepts
- Good knowledge and experience in IICS
- Good knowledge and experience in Python
- Experience in Workflow Monitor and Repository Manager
- Experience in databases: SQL Server/Oracle
- Good knowledge of scheduling engines like Control-M/ASG-Zena
- Any experience on Netezza would be an added advantage
- Good in performance-tuning-related activities
- Good analytical and troubleshooting skills
- Effective communication skills (oral and written)
- Should have experience of 2-3 BI life cycle implementations
Posted 2 months ago
4 - 7 years
7 - 11 Lacs
Uttar Pradesh
Work from Office
Location: Hyd/Chn/Bglore. About The Role: Informatica Dev:
- 8+ years of relevant experience, strong in DW fundamentals, data modelling concepts, and the Informatica suite
- Experience in Informatica PowerCenter, Workflow Manager, and Workflow Designer
- Good knowledge of data modelling concepts
- Good knowledge and experience in IICS
- Good knowledge and experience in Python
- Experience in Workflow Monitor and Repository Manager
- Experience in databases: SQL Server/Oracle
- Good knowledge of scheduling engines like Control-M/ASG-Zena
- Any experience on Netezza would be an added advantage
- Good in performance-tuning-related activities
- Good analytical and troubleshooting skills
- Effective communication skills (oral and written)
- Should have experience of 2-3 BI life cycle implementations
Posted 2 months ago
3 - 5 years
4 - 9 Lacs
Bengaluru
Work from Office
Job Title: MicroStrategy Developer.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology->Business Intelligence - Reporting->MicroStrategy. Preferred Skills: Technology->Business Intelligence - Reporting->MicroStrategy.

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements to system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. * Location of posting is subject to business requirements.
Posted 2 months ago
6 - 11 years
8 - 14 Lacs
Pune
Work from Office
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

About The Role: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 2 months ago
3 - 8 years
8 - 13 Lacs
Hyderabad
Hybrid
ETL and data warehouse testing; Azure cloud; SQL, Oracle, Netezza, and Unix. Analyze the STTM (source-to-target mapping) document and design test cases for validating source-to-target DB data transformation. Provide detailed-level effort estimates based on project data requirements. Understand the end-to-end client data landscape and perform TDM assessments.
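The core task this posting describes, validating a source-to-target transformation, usually boils down to reconciling the two datasets on counts and keys. The sketch below is an illustrative example only; the record layout, key column, and values are hypothetical, not from the posting.

```python
# Illustrative source-to-target reconciliation check for ETL testing.
# Records and the key column "id" are hypothetical examples.

def reconcile(source_rows, target_rows, key):
    """Compare source and target datasets on row count and key coverage."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}]
result = reconcile(source, target, "id")
# count_match is False and id 2 is reported missing from the target
```

In practice the same checks run as SQL against the source and target databases, with column-level comparisons added per the mapping document; this sketch only shows the shape of the test cases.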
Posted 2 months ago
8 - 12 years
30 - 35 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Informatica Dev:
- 8+ years of relevant experience, strong in DW fundamentals, data modelling concepts, and the Informatica suite
- Experience in Informatica PowerCenter, Workflow Manager, and Workflow Designer
- Good knowledge of data modelling concepts
- Good knowledge and experience in IICS
- Good knowledge and experience in Python
- Experience in Workflow Monitor and Repository Manager
- Experience in databases: SQL Server/Oracle
- Good knowledge of scheduling engines like Control-M/ASG-Zena
- Any experience on Netezza would be an added advantage
- Good in performance-tuning-related activities
- Good analytical and troubleshooting skills
- Effective communication skills (oral and written)
- Should have experience of 2-3 BI life cycle implementations
Posted 3 months ago
2 - 5 years
10 - 14 Lacs
Hyderabad
Work from Office
- 8+ years of relevant experience, strong in DW fundamentals, data modelling concepts, and the Informatica suite
- Experience in Informatica PowerCenter, Workflow Manager, and Workflow Designer
- Good knowledge of data modelling concepts
- Good knowledge and experience in IICS
- Good knowledge and experience in Python
- Experience in Workflow Monitor and Repository Manager
- Experience in databases: SQL Server/Oracle
- Good knowledge of scheduling engines like Control-M/ASG-Zena
- Any experience on Netezza would be an added advantage
- Good in performance-tuning-related activities
- Good analytical and troubleshooting skills
- Effective communication skills (oral and written)
- Should have experience of 2-3 BI life cycle implementations
Posted 3 months ago
2 - 5 years
11 - 15 Lacs
Bengaluru
Work from Office
Minimum qualifications:
- Bachelor's degree in Computer Science, or equivalent practical experience
- 15 years of customer-facing experience designing and deploying distributed data processing systems with one or more technologies
- Experience with SQL databases (e.g., PostgreSQL, MySQL, Oracle) and NoSQL databases (e.g., Mongo, Cassandra, DynamoDB, etc.)
- Experience with different types of data modeling techniques and methodologies for traditional Online Analytical Processing or Online Transaction Processing (OLAP/OLTP) databases and modern data warehouses

Preferred qualifications:
- Certification in Cloud
- 8 years of experience demonstrating technical client service
- Experience reading software code in one or more languages such as Java, Python, NodeJS, Golang, JavaScript
- Experience devising migration approaches and migrating on-premise data processing systems to Cloud
- Experience designing and deploying large-scale distributed data processing systems with one or more technologies: Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, Flink, Kafka, Druid, Hive, HBase, Vertica, Netezza, Teradata, Tableau, or MicroStrategy
- Knowledge of building and operationalizing data pipelines

About The Job: The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability, and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers (developers, small and large businesses, educational institutions, and government agencies) see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees, and partners.

In this role, you will work with customers on critical projects to transform their business with data. You will provide consulting, solution design, and technical program management capabilities to customer engagements while directing customer executives and technical stakeholders on project-related decisions. You will serve as a liaison between our customers and product teams to drive product excellence and adoption. In addition, you will also work with Google partners currently servicing accounts to manage programs, deliver consulting services, and provide technical guidance.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Work with customer technical leads, client executives, and partners to manage and deliver successful implementations of cloud solutions, becoming a trusted advisor to decision makers throughout the engagement
- Work with internal specialists, product, and engineering teams to package best practices and lessons learned into thought leadership, methodologies, and published assets
- Interact with business, partners, and customer technical stakeholders to manage project scope, priorities, deliverables, risks/issues, and timelines for successful client outcomes
- Propose solution architectures and manage the deployment of cloud-based databases, big data, and analytics solutions according to customer requirements, and implement best practices
- Travel up to 40% of the time for client engagements as needed

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and "EEO is the Law." If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 3 months ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk to client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL DataStage and Snowflake preferred
- Ability to use programming languages such as Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
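The pipeline-building skill listed above (extracting data from a repository, cleansing it, and delivering it to a consumer) can be sketched in plain Python. This is a minimal illustration only; the record layout, field names, and sample data are hypothetical, and real pipelines in this role would use tools like DataStage, DBT, or Spark:

```python
# Minimal extract-transform-load sketch (illustrative; all data is made up).

def extract():
    # Stand-in for reading raw records from a source repository.
    return [
        {"user_id": 1, "country": "IN", "amount": "150.0"},
        {"user_id": 2, "country": "in", "amount": "99.5"},
        {"user_id": 3, "country": "US", "amount": "bad"},
    ]

def transform(rows):
    # Cleanse: normalise country codes, coerce amounts, drop bad records.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip records with unparseable amounts
        clean.append({
            "user_id": row["user_id"],
            "country": row["country"].upper(),
            "amount": amount,
        })
    return clean

def load(rows, target):
    # Stand-in for writing to a data-consumer sink; here just a list.
    target.extend(rows)
    return target

warehouse = []
load(transform(extract()), warehouse)
```

The separation into three functions mirrors the extract/transform/load stages named in the posting, with cleansing (type coercion, normalisation, bad-record filtering) isolated in the transform step.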
Posted 3 months ago
4 - 6 years
6 - 8 Lacs
Hyderabad
Work from Office
Jr Req: Informatica Developer, 4-6 yrs, Hyderabad. Contact: Ravi Kishore (ravikishore.tellapuram@tcs.com), TCS, C2H, 900000.
Posted 3 months ago
8 - 13 years
10 - 15 Lacs
Chennai
Work from Office
- Experience in the Finance area (management reporting, risk reporting, capital and forecasting, risk or finance models)
- Exposure to working in a financial institution with business, technology, and enterprise stakeholders
- Experience working with business users on requirements elicitation and data requirements
- Understanding of data modeling concepts
- Requirements gathering in the context of mapping business requirements to technology sources, and interactions with source systems and business users
- Experience working in a SAFe Agile or product-oriented environment with a focus on functional mapping, TSDs, RTMs, and estimation techniques
- Strong SQL knowledge, including complex joins and analytical functions
- Good understanding of data flow, data models, and database applications; working knowledge of databases such as Oracle and Netezza
- Strong communication and documentation skills
- Comfortable working with users who are predominantly based in the EST timezone

Qualification: Work Mode - Hybrid; Work Timings - 2 PM to 11 PM
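As an illustration of the "complex joins and analytical functions" SQL skill mentioned above, here is a minimal sketch of a window function (a per-account running balance) using Python's stdlib sqlite3 driver. SQLite 3.25+ is assumed for window-function support, and the table and data are invented for the example:

```python
import sqlite3

# Hypothetical transactions table: account, timestamp, signed amount.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (account TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO txn VALUES (?, ?, ?)",
    [("A", 1, 100.0), ("A", 2, -40.0), ("B", 1, 10.0)],
)

# Analytical (window) function: cumulative SUM per account, ordered by time.
rows = conn.execute(
    """
    SELECT account, ts,
           SUM(amount) OVER (
               PARTITION BY account ORDER BY ts
           ) AS running_balance
    FROM txn
    ORDER BY account, ts
    """
).fetchall()
```

The `PARTITION BY ... ORDER BY ...` clause is what distinguishes an analytical function from a plain `GROUP BY` aggregate: each input row is preserved and annotated with the running aggregate over its partition.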
Posted 3 months ago
14 - 18 years
16 - 25 Lacs
Chennai, Bengaluru, Noida
Hybrid
Job Description:
- 12+ years of experience managing delivery of Data Warehouse projects (development and modernization/migration)
- Strong delivery background with experience managing large, complex Data Warehouse engagements
- Good to have experience with Snowflake, Matillion, DBT, Netezza/DataStage, and Oracle
- Healthcare Payer industry experience
- Extensive experience in program/project management and iterative, Waterfall, and Agile methodologies
- Ability to track and manage complex program budgets
- Experience managing the delivery of complex programs to meet the needs and required timelines set for the defined programs
- Communicate program review results to various stakeholders
- Experience building the team, providing guidance and education as needed to ensure the success of priority programs and promote cross-training within the department
- Experience developing and managing integrated program plans that incorporate both technical and business deliverables
- Verify that critical decision gates are well defined, communicated, and monitored for executive approval throughout the program
- Verify that work supports the corporate strategic direction
- Review resulting vendor proposals and estimates to ensure they satisfy both our functional requirements and technology strategies
- Project management methodologies, processes, and tools
- Knowledge of the project development life cycle
- Establish and maintain strong working relationships with various stakeholders, including team members, IT resources, resources in other areas of the business, and upper management
- Strong business acumen and political savvy
- Ability to collaborate while dealing with complex situations
- Ability to think creatively and to drive innovation
- Ability to motivate, lead, and inspire a diverse group to a common goal/solution with multiple stakeholders
- Ability to convert business strategy into action-oriented objectives and measurable results
- Strong negotiating, influencing, and consensus-building skills
- Ability to mentor, coach, and provide guidance to others

Responsibilities:
- Responsible for the end-to-end delivery of Application Development and Support services for the client
- Coordinate with the Enterprise Program Management Office to execute programs following defined standards and a governance structure that ensures alignment to the approved project development life cycle (PDLC)
- Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution
- Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and/or to resolve issues
- Proactively engage other members of the organization with specific subject knowledge to resolve issues or provide assistance
- Lead post-implementation reviews of major initiatives to provide lessons learned and continuous improvement
- Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic initiative implementation status
- Collaborate with business owners to develop divisional business plans that support the overall strategic direction
- Support the budget allocation process through ongoing financial tracking reports
- Develop and maintain service plans considering customer requirements
- Track and monitor to ensure adherence to SLAs/KPIs
- Identify opportunities for improvement to the service delivery process
- Address service delivery issues, escalations, and complaints; act as the first point of escalation for customer escalations
- Oversee shift management for various tracks
- Responsible for publishing production support reports and metrics

Regards, Sanjay Kumar
Posted 3 months ago
5 - 9 years
1 - 6 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Job description: Hiring for an ETL Informatica Developer with experience range 5 years & above.
Mandatory Skills: Network Engineer - F5 Load Balancer
Education: BE/B.Tech/MCA/M.Tech/MSc/MS

Responsibilities:
- Responsible for all activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter
- Good SQL knowledge; experience with Netezza and Yellowbrick would be a plus
- Basic UNIX shell scripting knowledge
- In-depth understanding of fundamental data-warehousing concepts such as dimensional modeling, star and snowflake schemas, data marts, security and deployment, fact and dimension tables, and logical and physical data modeling
- Strong skills in data analysis, data requirement analysis, and data mapping for ETL processes
- Hands-on experience tuning mappings, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions
- Extensive experience in ETL design, development, and maintenance using SQL Server SQL, PL/SQL, and SQL*Loader
- Well versed in developing complex SQL queries, unions, and multiple-table joins, with experience using views
- Experience in database programming in PL/SQL (stored procedures, triggers, and packages)
- Well versed in UNIX shell scripting
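To make the dimensional-modeling concepts above concrete, here is a minimal star-schema sketch: a fact table joined to a dimension table and aggregated. The schema and data are hypothetical, and Python's stdlib sqlite3 driver stands in for the warehouse database; in this role the equivalent work would be done with Informatica PowerCenter against a real warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes keyed by a surrogate key.
# Fact table: measures plus foreign keys into the dimensions.
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT
    );
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        qty         INTEGER
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 3), (1, 2), (2, 5);
""")

# The classic star-schema query: join fact to dimension, group by attribute.
totals = conn.execute("""
    SELECT d.name, SUM(f.qty)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
```

A snowflake schema differs only in that the dimension tables themselves are normalized into further sub-dimension tables, adding joins to the same basic pattern.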
Posted 3 months ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
Responsibilities

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk to client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL DataStage and Snowflake preferred
- Ability to use programming languages such as Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
2 - 6 years
4 - 8 Lacs
Pune
Work from Office
Responsibilities

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk to client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL DataStage and Snowflake preferred
- Ability to use programming languages such as Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you're passionate about developing your career, while helping others along the way, come join the Broadridge team.

- 8+ years designing, developing, and administering IBM Cognos 11.1.x applications; Cognos 11.1.x upgrade experience required
- Installing hot fixes/service packs to the existing version of Cognos Analytics
- Experience with Motio CI integration with Cognos Analytics
- Knowledge of the Cognos SDK and Cognos Life Cycle Manager is a plus
- Hands-on experience with granular Cognos security customization and installing third-party tools
- Hands-on experience with Cognos Framework Manager installation and configuration
- Experience publishing packages and customizing package access as required
- Knowledge of Cognos TM1 is a plus
- Experience with Cognos/Tableau installation and configuration in AWS
- Responsible for troubleshooting and resolving Cognos Analytics/Tableau issues, opening service requests with the Cognos vendor, working with different teams, providing recommendations, driving standards, and planning and executing effective transitions of development and production operations
- Deployment of Cognos in a clustered environment and performing upgrades
- Implement and document best practices for a Cognos environment
- Experience in Windows/Linux operating system environments and well versed in Linux OS commands
- Experience should include maintenance and support activities, performance monitoring and tuning, version upgrades, software configuration, business continuity and disaster recovery planning, and general IT processes such as Change Management, Configuration Management, Problem Resolution, and Incident Tracking
- Ability to cross-train the team
- Implementation of proactive Cognos environment health checks
- Hands-on Cognos user group, security, and user entitlement administration
- Experience with Cognos LDAP/Active Directory integration and synchronization preferred
- Experience with IIS 7.5 or higher is a plus
- Integrating Cognos with a SharePoint portal/team site is a plus
- Ability to provide 24x7 production support for Cognos on a rotation basis, with excellent communication skills, required
- Any other BI tool experience, such as Tableau, Jaspersoft, or Crystal, is a plus
- Experience with industry BI/reporting toolsets including Tableau, Jaspersoft, Cognos, Power BI, and Crystal
- Tableau 2022.1.x upgrade experience required
- Knowledge of Jasper Reports Server upgrades from 6.2 to 8.1 is a plus
- Experience connecting to Hadoop, Oracle, Sybase, DB2, Netezza, Teradata, and SQL databases
- Knowledge of data science integration and applications (Python, R)
- Knowledge of programming languages (SDKs, APIs, Java, JavaScript)
- Customizing the look and feel of Cognos and Tableau URLs is a plus
- Excellent communication skills (must be able to interface with both technical and business leaders in the organization)
- Oversee and perform all system administration and change management responsibilities on the Tableau server, including server maintenance, patching, and hardware/software upgrades
- Experience migrating Tableau workbooks/data sources to higher environments
- Expertise in installing/configuring Jasper Reports Server in on-premises and cloud environments
- Experience deploying Jasper report code from one environment to another
- Experience installing/configuring Apache Tomcat and knowledge of customizing system.xml and web.xml files
Posted 10 months ago