10.0 - 12.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Description

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.

How will you make an impact in this role?

The Infrastructure Data & Analytics team unifies FinOps, Data Science and Business Intelligence to enable Technology cost transparency, infrastructure performance optimization and commercial efficiency for the enterprise through consistent, high-quality data and predictive analytics. This team within Global Infrastructure aims to establish and reinforce a culture of effective metrics, data-driven business processes, architecture simplification, and cost awareness. Metric-driven cost optimization, workload-specific forecasting and robust contract management are among the tools and practices required to drive accountability for delivering business solutions that derive maximum value. The result will provide a solid foundation for decision-making around cost, quality and speed.

We are seeking an experienced Solutions Architect for our team who can support the design, development and enhancement of our data model and architecture. This DataOps team sets the foundation for how infrastructure utilization, consumption and asset inventory data is ingested, validated, and presented in our suite of reporting capabilities. The Solutions Architect will help ensure our data architecture integrates seamlessly with other Technology platforms and optimizes performance in line with architecture best practices. This individual will be responsible for setting the strategy for our target data architecture, collaborating across Tech and business partners, leveraging best practices for risk and data management, and optimizing cross-platform integration.

Let's build on what you know. The role will require a unique blend of strong DataOps technical and leadership skills to translate business decisions into data requirements. This individual will build a deep understanding of the infrastructure data we use in order to work across the ID&A team and key stakeholders to identify the appropriate data to tell a data story.
This includes designing and implementing a data architecture that follows enterprise architecture and data management best practices, ensuring data ingestion, transformation, storage and analytics are handled according to their specific purpose using the appropriate tools: ingestion captures raw data without applying business logic, transformation processes data discretely for auditability, and analytical queries retrieve structured outputs without relying on upstream processes (a brief illustrative sketch of this separation follows the listing). They will act as a key point of contact with upstream and downstream partner teams to ensure data is ingested and provided in a consistent and repeatable way. They will bring passion for data-driven decisions, enterprise solutions, and collaboration to the role, transforming platform data into actionable insights by applying data engineering and data visualization best practices.

Key responsibilities include:
- Data Architecture: Perform all technical aspects of data architecture and solutions architecture for ID&A, including integrating data sources, defining appropriate data environment strategies, and enforcing API (Application Programming Interface) standardization as applicable
- Data Design: Translate logical data architectures into physical data designs, ensuring alignment with data modeling best practices and platform blueprints
- Data Process and Monitoring: Ensure proper data ingestion, validation, testing, and monitoring for the ID&A data lake; lead design and code reviews to ensure the delivery of best-in-class data analytics and adherence to enterprise standards
- Data Migration: Design and support migration to a data model that is independent of the particular technologies or tools used for storing, processing, or accessing the data
- Data Integrity: Ensure the accuracy, completeness, and quality of data are maintained over time regardless of upstream or downstream systems
- Agile Methodologies: Act as the platform's single point of contact (SPOC) when coordinating with relevant stakeholders and supervising the data architecture implementation; identify and implement technical solutions and business process improvements by analyzing business needs and leveraging innovative technologies; ensure the engineered environment meets the specification in terms of business requirements, application design and infrastructure requirements, i.e. remain accountable for platform integration performance and efficiency
- Enterprise Thinking: Partner with data architects and engineers across Global Infrastructure and Asset Management teams to evolve the broader tech operations data architecture landscape; coach and mentor engineering resources on solution architecture, providing advice, mentorship and assistance to less experienced colleagues as required; contribute to decisions about new technologies, tools, methods and approaches
- Innovation: Leverage the evolving technical landscape as needed, including AI, Big Data, Machine Learning and other technologies to deliver meaningful business insights; ensure accurate asset information and architecture decisions are well documented in the appropriate systems and repositories

Minimum Requirements:
- 10+ years of solutions architecture, data architecture or DataOps engineering experience in designing pipeline orchestration, data quality monitoring, governance, security processes, and self-service data access
- Prior experience in multiple IT disciplines with a confirmed understanding of architectural concepts (business, data, technical and solution) and real-time processing
- Extensive experience using a systems analysis and design methodology applicable to an agile product environment
- Advanced to authoritative knowledge and understanding of solution architecture, complex application systems design and platform integration via modern approaches (e.g. RESTful APIs)
- Ability to perform system design reviews to ensure selection of appropriate technology, efficient use of resources, and alignment to strategic platform roadmaps
- Full understanding of Service Oriented Architecture design principles, execution patterns and performance optimization techniques
- Able to participate in the prevention, diagnosis, and resolution of system outages as a leader in the underlying architecture
- Hands-on coding experience in Python
- Hands-on expertise with design and development across one or more database management systems (e.g. GCP, SQL Server, PostgreSQL, Oracle)
- Strong analytical skills with a proven ability to understand and document business data requirements into complete, accurate, extensible and flexible logical data models, and experience with data visualization tools (e.g. Apptio BI, Power BI)
- Ability to write efficient SQL queries to extract and manipulate data from relational databases, data warehouses and batch processing systems
- Fluent in data risk, management, and compliance terminology and best practices
- Proven track record of managing large, complex ecosystems with multiple stakeholders
- Self-starter who is able to problem-solve effectively, organize and document processes, and prioritize features with limited guidance
- An enterprise mindset that connects the dots across various requirements and the broader operations/infrastructure data architecture landscape
- Excellent influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication
- Understanding of complex software delivery including build, test, deployment, and operations; conversant in AI, Data Science, and Business Intelligence concepts and technology stack
- Foundational Public Cloud (AWS, Google, Microsoft) certification; advanced Public Cloud certifications a plus
- Experience with design and coding across multiple platforms and languages a plus
- Bachelor's Degree in computer science, computer science engineering, data engineering, or related field required; advanced degree preferred

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
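To make the ingestion / transformation / analytics separation described in this listing concrete, here is a minimal, self-contained Python sketch; it uses sqlite3 purely as a stand-in engine, and the table names and fields are illustrative assumptions rather than the actual ID&A data model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# 1. Ingestion: land raw records exactly as received, with no business logic applied.
conn.execute("CREATE TABLE raw_asset_usage (asset_id TEXT, cpu_hours TEXT, loaded_at TEXT)")
conn.execute(
    "INSERT INTO raw_asset_usage VALUES (?, ?, datetime('now'))",
    ("srv-001", "12.5"),
)

# 2. Transformation: a discrete, auditable step that types and shapes the raw data.
conn.execute("""
    CREATE TABLE curated_asset_usage AS
    SELECT asset_id,
           CAST(cpu_hours AS REAL) AS cpu_hours,
           loaded_at
    FROM raw_asset_usage
""")

# 3. Analytics: structured queries read only the curated layer, never the raw ingestion layer.
for row in conn.execute(
    "SELECT asset_id, SUM(cpu_hours) FROM curated_asset_usage GROUP BY asset_id"
):
    print(row)
```

Keeping each layer in its own step is what makes the pipeline auditable: raw data can always be replayed, and analytical queries never depend on ingestion internals.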
Posted 1 week ago
5.0 - 10.0 years
9 - 12 Lacs
bengaluru
Work from Office
Hiring GCP Data Engineers (3–5 yrs) & Senior GCP Data Engineers (5–8 yrs). Skills: Python, SQL, BigQuery, Dataflow, Pub/Sub, Composer. Location: Remote/Hybrid/Onsite.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You are a Data Science Specialist with 4-5 years of experience in designing and implementing advanced analytical solutions. You should have a strong foundation in statistics and expertise in solving real-world business problems using Machine Learning and Data Science. Your track record should include building and deploying data products, and any exposure to NLP and Generative AI will be considered an advantage.

Your key responsibilities will include collaborating with cross-functional teams to translate business problems into data science use cases, designing, developing, and deploying data science models, building and productionizing data products for measurable business impact, performing exploratory data analysis, feature engineering, model validation, and performance tuning, as well as applying statistical methods to uncover trends, anomalies, and actionable insights. You will need to implement scalable solutions using Python (or R/Scala), SQL, and modern data science libraries. It is important to stay updated with advancements in NLP and Generative AI and evaluate their relevance to internal use cases. Additionally, you should be able to communicate findings and recommendations clearly to both technical and non-technical stakeholders.

Qualifications:
- Bachelor's degree in a quantitative field such as Statistics, Computer Science, Mathematics, Engineering, or a related discipline is required.
- A Master's degree or certifications in Data Science, Machine Learning, or Applied Statistics is a strong advantage.

Experience:
- 4-5 years of hands-on experience in data science projects across different domains.
- Demonstrated experience in end-to-end ML model development, from problem framing to deployment.
- Prior experience working with cross-functional business teams is highly desirable.

Must-Have Skills:
- Statistical Expertise: Strong understanding of hypothesis testing, regression, classification techniques, and distributions.
- Business Problem Solving: Ability to translate ambiguous business challenges into data science use cases.
- Model Development: Hands-on experience in building and validating machine learning models.
- Programming Proficiency: Strong skills in Python (Pandas, NumPy, Scikit-learn, Matplotlib/Seaborn) and SQL.
- Data Manipulation: Experience in handling structured/unstructured datasets, EDA, and data cleaning.
- Communication: Ability to explain technical concepts to non-technical audiences.
- Version Control & Collaboration: Familiarity with Git/GitHub and collaborative practices.
- Deployment Mindset: Understanding of how to build usable and scalable data products.

Desirable Skills:
- Experience with survival analysis or time-to-event modeling techniques.
- Exposure to NLP methods like tokenization, embeddings, sentiment analysis.
- Familiarity with Generative AI technologies like LLMs, transformers, prompt engineering.
- Experience with MLOps tools, pipeline orchestration, or cloud platforms (AWS, GCP, Azure).
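As a rough illustration of the end-to-end model build-and-validate loop this listing describes, here is a minimal scikit-learn sketch on synthetic data; the dataset, estimator, and metrics are placeholders, not the team's actual use case.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Synthetic stand-in for a business dataset after EDA and feature engineering.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)

# Model validation: cross-validation on the training split before a final holdout check.
cv_scores = cross_val_score(model, X_train, y_train, cv=5)
print("CV accuracy:", cv_scores.mean())

# Final fit and evaluation on the held-out test set.
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```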
Posted 3 weeks ago
6.0 - 8.0 years
18 - 30 Lacs
Hyderabad
Hybrid
Key Skills: Data engineering, Apache Airflow, GCP, BigQuery, GCS, SQL, ETL/ELT, Docker, Kubernetes, data governance, Agile, CI/CD, DevOps, pipeline orchestration, technical leadership.

Roles & Responsibilities:
- Evaluate and provide scalable technical solutions to address complex and interdependent data processes.
- Ensure data quality and accuracy by implementing data quality checks, data contracts, and governance processes.
- Collaborate with software development teams and business analysts to understand data requirements and deliver fit-for-purpose data solutions.
- Lead the team in delivering end-to-end data engineering solutions.
- Design, develop, and maintain complex applications to support data processing workflows.
- Develop and manage data pipelines and workflows using Apache Airflow on GCP (a minimal illustrative sketch follows this listing).
- Integrate data from various sources into Google BigQuery and Google Cloud Storage (GCS).
- Write and optimize advanced SQL queries for ETL/ELT processes.
- Maintain data consistency and troubleshoot issues in data workflows.
- Create and maintain detailed technical documentation for pipelines and workflows.
- Mentor junior data engineers and provide technical leadership and support.
- Lead project planning, execution, and successful delivery of data engineering initiatives.
- Stay updated with emerging trends and technologies in data engineering and cloud computing.

Experience Requirement:
- 6-8 years of experience leading the design, development, and deployment of complex data pipelines.
- Strong working knowledge of Apache Airflow on GCP for orchestration.
- Hands-on experience integrating data into Google BigQuery and GCS from various sources.
- Proficient in writing and optimizing complex SQL queries for large-scale data processing.
- Practical knowledge of containerization technologies like Docker and Kubernetes.
- Experience in implementing data governance and adhering to data security best practices.
- Familiarity with Agile methodology and working in cross-functional teams.
- Experience with CI/CD pipelines and DevOps practices for data engineering workflows.

Education: B.Tech + M.Tech (Dual), B.Tech, M.Tech.
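As a rough sketch of the Airflow-on-GCP orchestration this role involves, the following DAG loads files from GCS into a BigQuery staging table and then builds a reporting table with a SQL job; the project, bucket, dataset, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSV files from GCS into a staging table.
    load_to_staging = GCSToBigQueryOperator(
        task_id="load_to_staging",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.sales",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into the reporting table with a BigQuery SQL job.
    build_reporting = BigQueryInsertJobOperator(
        task_id="build_reporting",
        configuration={
            "query": {
                "query": """
                    SELECT region, SUM(amount) AS total_amount
                    FROM `example-project.staging.sales`
                    GROUP BY region
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "sales_by_region",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_to_staging >> build_reporting
```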
Posted 1 month ago
10.0 - 15.0 years
35 - 50 Lacs
chennai
Hybrid
Key responsibilities include:
- Data Architecture: Perform all technical aspects of data architecture and database management for ID&A, including developing data pipelines, new database structures and APIs as applicable
- Data Design: Translate logical data architectures into physical data designs, ensuring alignment with data modeling best practices and standards
- Data Process and Monitoring: Ensure proper data ingestion, validation, testing, and monitoring for the ID&A data lake (a brief data-quality check sketch follows this listing); support database and data platform administration for initiatives building, enhancing, or maintaining databases, data warehouses and data pipelines
- Data Migration: Design and support migration to a data model that is independent of the particular technologies or tools used for storing, processing, or accessing the data
- Data Integrity: Ensure the accuracy, completeness, and quality of data are maintained over time regardless of upstream or downstream systems
- Agile Methodologies: Function as a senior member of an agile feature team and manage data assets per enterprise standards, guidelines and policies; partner closely with the business intelligence team to capture and define data requirements for new and enhanced data visualizations; work with product teams to prioritize new features for ongoing sprints and manage the backlog
- Enterprise Thinking: Partner with data architects and engineers across Global Infrastructure and Asset Management teams to evolve the broader tech operations data architecture landscape; mentor junior data engineers through ongoing development efforts and enforcement of SDLC standards; act as point of contact for data-related inquiries and data access requests; contribute to decisions about tools, methods and approaches
- Innovation: Leverage the evolving technical landscape as needed, including AI, Big Data, Machine Learning and other technologies to deliver meaningful business insights

Minimum Requirements:
- 8+ years of DataOps engineering experience in implementing pipeline orchestration, data quality monitoring, governance, security processes, and self-service data access
- Experience managing databases, ETL/ELT pipelines, data lake architectures, and real-time processing
- Proficiency in API development and stream processing frameworks
- Hands-on coding experience in Python
- Hands-on expertise with design and development across one or more database management systems (e.g. SQL Server, PostgreSQL, Oracle)
- Testing and Troubleshooting: Ability to test, troubleshoot, and debug data processes
- Strong analytical skills with a proven ability to understand and document business data requirements in complete, accurate, extensible and flexible logical data models, and experience with data visualization tools (e.g. Apptio BI, Power BI)
- Ability to write efficient SQL queries to extract and manipulate data from relational databases, data warehouses and batch processing systems
- Fluent in data risk, management, and compliance terminology and best practices
- Proven track record of managing large, complex ecosystems with multiple stakeholders
- Self-starter who is able to problem-solve effectively, organize and document processes, and prioritize features with limited guidance
- An enterprise mindset that connects the dots across various requirements and the broader operations/infrastructure data architecture landscape
- Excellent influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication
- Understanding of complex software delivery including build, test, deployment, and operations; conversant in AI, Data Science, and Business Intelligence concepts and technology stack
- Foundational Public Cloud (AWS, Google, Microsoft) certification; advanced Public Cloud certifications a plus
- Experience working in technology business management, technology infrastructure or data visualization teams a plus
- Experience with design and coding across multiple platforms and languages a plus
- Bachelor's Degree in computer science, computer science engineering, data engineering, or related field required; advanced degree preferred
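To illustrate the kind of data quality monitoring this listing emphasizes, here is a minimal pandas sketch of completeness and accuracy checks on a batch of records; the columns, sample data, and thresholds are assumptions for illustration only.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return simple completeness and accuracy signals for a batch of usage records."""
    return {
        "row_count": len(df),
        "null_rate_asset_id": df["asset_id"].isna().mean(),
        "duplicate_keys": int(df.duplicated(subset=["asset_id", "usage_date"]).sum()),
        "negative_usage": int((df["cpu_hours"] < 0).sum()),
    }

# Illustrative batch with one missing key and one implausible value.
batch = pd.DataFrame(
    {
        "asset_id": ["srv-001", "srv-002", None],
        "usage_date": ["2024-01-01", "2024-01-01", "2024-01-01"],
        "cpu_hours": [12.5, 8.0, -1.0],
    }
)

results = run_quality_checks(batch)
print(results)
# In a real pipeline, thresholds on these signals would gate promotion of the batch
# from the staging layer to the curated layer.
```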
Posted Date not available