
73 Netezza Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 6.0 years

55 - 60 Lacs

Pune

Work from Office


At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow: informed and validated by science and data, superpowered by creativity and design, all underpinned by technology created with purpose. Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Grade specific: the role supports the team in building and maintaining data infrastructure and systems within an organization. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux (Red Hat), Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management.
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 9 hours ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra

Remote


Solution Engineer - Data & AI Mumbai, Maharashtra, India Date posted Jun 16, 2025 Job number 1830869 Work site Up to 50% work from home Travel 25-50% Role type Individual Contributor Profession Technology Sales Discipline Technology Specialists Employment type Full-Time Overview As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications 6+ years of technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience. 4+ years of experience with cloud and hybrid or on-premises infrastructure, architecture design, migrations, industry standards, and/or technology management. Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2. OR 5+ years of technical pre-sales or technical consulting experience; OR a Bachelor's Degree in Computer Science, Information Technology, or a related field AND 4+ years of technical pre-sales or technical consulting experience; OR a Master's Degree in Computer Science, Information Technology, or a related field AND 3+ years of technical pre-sales or technical consulting experience; OR equivalent experience. Expert in Azure Databases (SQL DB, Cosmos DB, PostgreSQL), spanning migration, modernization, and creating new AI apps. Expert in Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) across data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Responsibilities Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 20 hours ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana

Remote


Solution Engineer - Cloud & Data AI Gurgaon, Haryana, India Date posted Jun 16, 2025 Job number 1830866 Work site Up to 50% work from home Travel 25-50% Role type Individual Contributor Profession Technology Sales Discipline Technology Specialists Employment type Full-Time Overview As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications Preferred: 6+ years of technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience. 4+ years of experience with cloud and hybrid or on-premises infrastructure, architecture design, migrations, industry standards, and/or technology management. Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2. OR 5+ years of technical pre-sales or technical consulting experience; OR a Bachelor's Degree in Computer Science, Information Technology, or a related field AND 4+ years of technical pre-sales or technical consulting experience; OR a Master's Degree in Computer Science, Information Technology, or a related field AND 3+ years of technical pre-sales or technical consulting experience; OR equivalent experience. Expert in Azure Databases (SQL DB, Cosmos DB, PostgreSQL), spanning migration, modernization, and creating new AI apps. Expert in Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) across data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Responsibilities Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 20 hours ago

Apply

7.5 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: IBM Netezza. Good-to-have skills: NA. Minimum 7.5 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve data processes to enhance efficiency. Professional & Technical Skills: - Must-have skills: Proficiency in IBM Netezza. - Good-to-have skills: Experience with data warehousing concepts and practices. - Strong understanding of ETL processes and data integration techniques. - Familiarity with data modeling and database design principles. - Experience with performance tuning and optimization of data queries. Additional Information: - The candidate should have a minimum of 7.5 years of experience in IBM Netezza. - This position is based in Pune. - 15 years of full-time education is required.
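The ETL responsibilities this role describes (building pipelines, enforcing data quality, migrating data across systems) can be sketched in miniature. The snippet below is a hypothetical extract-transform-load pass that uses Python's bundled sqlite3 purely as a stand-in target; a real Netezza deployment would use its own client driver and bulk loaders, and the table, field names and sample rows here are invented for illustration.

```python
import sqlite3

# Hypothetical source records, standing in for an extract from a legacy system.
source_rows = [
    {"id": 1, "amount": "120.50", "region": "west"},
    {"id": 2, "amount": "bad",    "region": "east"},   # fails the quality check
    {"id": 3, "amount": "75.00",  "region": "east"},
]

def transform(row):
    """Validate and convert one record; return None to reject it."""
    try:
        return (row["id"], float(row["amount"]), row["region"].upper())
    except (ValueError, KeyError):
        return None

# Load step: only rows that passed the quality gate reach the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
clean = [t for t in map(transform, source_rows) if t is not None]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
conn.commit()

loaded, = conn.execute("SELECT COUNT(*) FROM sales").fetchone()
print(f"loaded {loaded} of {len(source_rows)} rows")  # rejected rows would go to a review queue
```

In a production pipeline the rejected rows would typically be written to an error table rather than silently dropped, so data-quality issues stay auditable.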

Posted 3 days ago

Apply

2.0 - 3.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


Our Company Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. What You’ll Do This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is responsible for all Changes across the AWS, Azure & Google Cloud Platforms. A few core responsibilities (though not limited to these): The Cloud Ops Administrator is responsible for managing Teradata’s as-a-Service offering on public cloud (AWS/Azure/GC). Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, optimizing the environment, and third-party software support. Supporting the onsite teams with customer migrations from on-premises to cloud. Implementing security best practices and analyzing partner compatibility. Managing and coordinating all activities necessary to implement Changes in the environment. Ensuring Change status, progress and issues are communicated to the appropriate groups. Reviewing and implementing the process lifecycle and reporting to upper management. Evaluating performance metrics against the critical success factors and taking actions to streamline the process. Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan. Documenting closure activities in the Change record and completing the Change record. Escalating any deviations from plans to the appropriate TLs/Managers. Providing input for the ongoing improvement of the Change Management process. Managing and supporting 24x7 VaaS environments for multiple customers.
Devise and implement security and operations best practices. Implement development and production environments for the data warehousing cloud environment. Plan and execute Backup, Archive and Recovery for the cloud-based data warehouses across all platforms (AWS/Azure/GC). Ensure SLAs are met while implementing changes, and that all scheduled changes are implemented within the prescribed window. Act as the first level of escalation and the first level of help/support for team members. Who You’ll Work With This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is responsible for all Cases, Incidents and Changes across the Azure & Google Cloud Platforms, reporting into the Delivery Manager for Change Ops. What Makes You a Qualified Candidate Minimum 2-3 years of IT experience in a Systems Administrator / Engineer role. Minimum 1 year of hands-on Cloud experience (Azure/AWS/GCP). Cloud certification; ITIL or other relevant certifications are desirable. Day-to-day operations experience with ServiceNow or another ITSM tool. Must be willing to provide 24x7 on-call support on a rotational basis with the team. Must be willing to travel, both short-term and long-term. What You’ll Bring 4-year Engineering Degree or 3-year Master of Computer Applications. Excellent oral and written communication skills in the English language. Teradata/DBMS experience. Hands-on experience with Teradata administration and a strong understanding of Cloud capabilities and limitations. Thorough understanding of Cloud Computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape. Implement and support new and existing customers on VaaS infrastructure. Thorough understanding of infrastructure (firewalls, load balancers, hypervisor, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution.
Should have good knowledge of cloud services for Compute, Storage, Network and OS for at least one of the following cloud platforms: Azure. Managed responsibilities as a shift lead. Should have experience with Enterprise VPN and Azure virtual LAN with a data center. Knowledge of monitoring, logging and cost management tools. Hands-on experience with database architecture/modeling, RDBMS and NoSQL. Should have a good understanding of data archive/restore policies. Basic Teradata knowledge; VMware certification is an added advantage. Working experience in Linux administration and shell scripting. Working experience with any RDBMS such as Oracle, DB2, Netezza, Teradata, SQL Server or MySQL. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 5 days ago

Apply

2.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Our Company Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. What You’ll Do This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is responsible for all Changes across the AWS, Azure & Google Cloud Platforms. A few core responsibilities (though not limited to these): The Cloud Ops Administrator is responsible for managing Teradata’s as-a-Service offering on public cloud (AWS/Azure/GC). Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, optimizing the environment, and third-party software support. Supporting the onsite teams with customer migrations from on-premises to cloud. Implementing security best practices and analyzing partner compatibility. Managing and coordinating all activities necessary to implement Changes in the environment. Ensuring Change status, progress and issues are communicated to the appropriate groups. Reviewing and implementing the process lifecycle and reporting to upper management. Evaluating performance metrics against the critical success factors and taking actions to streamline the process. Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan. Documenting closure activities in the Change record and completing the Change record. Escalating any deviations from plans to the appropriate TLs/Managers. Providing input for the ongoing improvement of the Change Management process. Managing and supporting 24x7 VaaS environments for multiple customers.
Devise and implement security and operations best practices. Implement development and production environments for the data warehousing cloud environment. Plan and execute Backup, Archive and Recovery for the cloud-based data warehouses across all platforms (AWS/Azure/GC). Ensure SLAs are met while implementing changes, and that all scheduled changes are implemented within the prescribed window. Act as the first level of escalation and the first level of help/support for team members. Who You’ll Work With This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is responsible for all Cases, Incidents and Changes across the Azure & Google Cloud Platforms, reporting into the Delivery Manager for Change Ops. What Makes You a Qualified Candidate Minimum 2-3 years of IT experience in a Systems Administrator / Engineer role. Minimum 1 year of hands-on Cloud experience (Azure/AWS/GCP). Cloud certification; ITIL or other relevant certifications are desirable. Day-to-day operations experience with ServiceNow or another ITSM tool. Must be willing to provide 24x7 on-call support on a rotational basis with the team. Must be willing to travel, both short-term and long-term. What You’ll Bring 4-year Engineering Degree or 3-year Master of Computer Applications. Excellent oral and written communication skills in the English language. Teradata/DBMS experience. Hands-on experience with Teradata administration and a strong understanding of Cloud capabilities and limitations. Thorough understanding of Cloud Computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape. Implement and support new and existing customers on VaaS infrastructure. Thorough understanding of infrastructure (firewalls, load balancers, hypervisor, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution.
Should have good knowledge of cloud services for Compute, Storage, Network and OS for at least one of the following cloud platforms: Azure. Managed responsibilities as a shift lead. Should have experience with Enterprise VPN and Azure virtual LAN with a data center. Knowledge of monitoring, logging and cost management tools. Hands-on experience with database architecture/modeling, RDBMS and NoSQL. Should have a good understanding of data archive/restore policies. Basic Teradata knowledge; VMware certification is an added advantage. Working experience in Linux administration and shell scripting. Working experience with any RDBMS such as Oracle, DB2, Netezza, Teradata, SQL Server or MySQL. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Position: Database Location: Noida, India www.SEW.ai Who We Are: SEW, with its innovative and industry-leading cloud platforms, delivers the best Digital Customer Experiences (CX) and Workforce Experiences (WX), powered by AI, ML, and IoT Analytics, to global energy, water, and gas providers. At SEW, the vision is to Engage, Empower, and Educate billions of people to save energy and water. We partner with businesses to deliver platforms that are easy to use, integrate seamlessly, and help build a strong technology foundation that allows them to become future-ready. Searching for your dream job? We are a true global company that values building meaningful relationships and maintaining a passionate work environment while fostering innovation and creativity. At SEW, we firmly believe that each individual contributes to our success and, in return, we provide opportunities for them to learn new skills and build a rewarding professional career. A Couple of Pointers: • We are the fastest-growing company in our space, with 420+ clients and 1550+ employees. • Our clientele is based in the USA, Europe, Canada, Australia, Asia Pacific and the Middle East. • Our platforms engage millions of global users, and we keep adding millions every month. • We have been awarded 150+ accolades to date, and our clients are continually recognized by industry analysts for implementing our award-winning product. • We have been featured by Forbes, the Wall Street Journal and the LA Times for our continuous innovation and excellence in the industry. Who are we looking for? An ideal candidate who can demonstrate in-depth knowledge and understanding of RDBMS concepts, with experience writing complex queries and data integration processes in SQL/TSQL and NoSQL. This individual will be responsible for helping design, develop and implement new and existing applications.
Roles and Responsibilities: • Review the existing database design and data management procedures and provide recommendations for improvement. • Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities. • Develop technical documentation as needed. • Architect, develop, validate and communicate Business Intelligence (BI) solutions such as dashboards, reports, KPIs, instrumentation, and alert tools. • Define data architecture requirements for cross-product integration within and across cloud-based platforms. • Analyze, architect, develop, validate and support integrating data into the SEW platform from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMS. • Perform thorough analysis of complex data and recommend actionable strategies. • Effectively translate data modeling and BI requirements into the design process. • Big Data platform design, i.e. tool selection, data integration, and data preparation for predictive modeling. Required Skills: • Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models). • 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, DataStage, etc. • 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes. • Exposure to a NoSQL technology, preferably MongoDB. • Experience processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.). • Understanding of data warehousing concepts and decision support systems.
• Ability to deal with sensitive and confidential material and adhere to worldwide data security requirements. • Experience writing documentation for design and feature requirements. • Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc. • Excellent communication and collaboration skills.
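The data modeling and BI duties above (schema design, logical/physical models, dashboard-style reporting) can be illustrated with a minimal star schema, the standard dimensional-modeling pattern of a fact table keyed to dimension tables. This is a hypothetical sketch using Python's bundled sqlite3; the table names, columns and figures are invented, and a production warehouse would live in an enterprise RDBMS.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per customer.
cur.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name TEXT, segment TEXT)""")

# Fact table: numeric measures at the grain of one meter reading,
# linked to the dimension by a surrogate key.
cur.execute("""CREATE TABLE fact_usage (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    reading_date TEXT, kwh REAL)""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "commercial"), (2, "Beta", "residential")])
cur.executemany("INSERT INTO fact_usage VALUES (?, ?, ?)",
                [(1, "2025-06-01", 40.0), (1, "2025-06-02", 35.0),
                 (2, "2025-06-01", 12.5)])

# A typical BI rollup: total usage per customer segment.
rows = cur.execute("""
    SELECT d.segment, SUM(f.kwh)
    FROM fact_usage f JOIN dim_customer d USING (customer_key)
    GROUP BY d.segment ORDER BY d.segment""").fetchall()
print(rows)  # [('commercial', 75.0), ('residential', 12.5)]
```

The same fact table can feed many such rollups (by date, by region, by customer) without restructuring, which is the main payoff of the dimensional design.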

Posted 5 days ago

Apply

0 years

4 - 7 Lacs

Chennai

On-site

Consulting Manager
JD: Seeking professionals capable of performing thorough analysis and articulation of risk or finance technology model data requirements, and of identifying and understanding specific data quality issues to ensure effective delivery of data to users with standard tools. The candidate will analyze internal and external regulations (credit risk), prepare functional documentation on that basis, and work with large amounts of data: facts, figures, and number crunching.
Responsibility: Perform thorough analysis and articulation of risk or finance technology model data requirements; identify and understand specific data quality issues to ensure effective delivery of data to users using standard tools. Provide analysis of internal and external regulations (credit risk) and prepare functional documentation on external regulations (credit risk). Work with large amounts of data: facts, figures, and number crunching.
Primary Skills: Domain knowledge, especially in the risk or finance technology area. Strong SQL knowledge, involving complex joins and analytical functions. Good understanding of data flows, data models and database applications. Working knowledge of databases such as Oracle and Netezza.
Secondary Skills: Conceptual knowledge of ETL and data warehousing (working knowledge is an added advantage). Basic knowledge of Java is an added advantage.
About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
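The "complex joins and analytical functions" skill called out above can be sketched with a window function over a joined result. The schema, data, and SQLite backend are assumptions for illustration; equivalent SQL runs on Oracle or Netezza with minor dialect changes:

```python
import sqlite3

# Toy risk-data example: rank exposures per counterparty with RANK(),
# joining a fact-style table to a reference table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE counterparty (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE exposure (cp_id INTEGER, trade_date TEXT, amount REAL);
INSERT INTO counterparty VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO exposure VALUES
    (1, '2024-01-01', 100.0), (1, '2024-01-02', 250.0),
    (2, '2024-01-01', 400.0);
""")
rows = conn.execute("""
    SELECT c.name,
           e.amount,
           RANK() OVER (PARTITION BY c.id ORDER BY e.amount DESC) AS rnk
    FROM exposure AS e
    JOIN counterparty AS c ON c.id = e.cp_id
    ORDER BY c.name, rnk
""").fetchall()
print(rows)
# → [('Acme', 250.0, 1), ('Acme', 100.0, 2), ('Globex', 400.0, 1)]
```

The PARTITION BY restarts the ranking per counterparty, which is the kind of analytical-function reasoning the role asks for.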

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Consulting Manager
JD: Seeking professionals capable of performing thorough analysis and articulation of risk or finance technology model data requirements, and of identifying and understanding specific data quality issues to ensure effective delivery of data to users with standard tools. The candidate will analyze internal and external regulations (credit risk), prepare functional documentation on that basis, and work with large amounts of data: facts, figures, and number crunching.
Responsibility: Perform thorough analysis and articulation of risk or finance technology model data requirements; identify and understand specific data quality issues to ensure effective delivery of data to users using standard tools. Provide analysis of internal and external regulations (credit risk) and prepare functional documentation on external regulations (credit risk). Work with large amounts of data: facts, figures, and number crunching.
Primary Skills: Domain knowledge, especially in the risk or finance technology area. Strong SQL knowledge, involving complex joins and analytical functions. Good understanding of data flows, data models and database applications. Working knowledge of databases such as Oracle and Netezza.
Secondary Skills: Conceptual knowledge of ETL and data warehousing (working knowledge is an added advantage). Basic knowledge of Java is an added advantage.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Sr. Software Development Engineer (Big Data Engineer) Overview Job Description Summary Mastercard is a technology company in the Global Payments Industry. We operate the world’s fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities – such as shopping, travelling, running a business and managing finances – easier, more secure and more efficient for everyone. Mastercard’s Data & Services team is a key differentiator for MasterCard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services. We are currently seeking a Software Development Engineer-II for Location Program within the Data & Services group. You will own end-to-end delivery of engineering projects for some of our analytics and BI solutions that leverage Mastercard dataset combined with proprietary analytics techniques, to help businesses around the world solve multi-million dollar business problems. 
Roles And Responsibilities
• Work as a member of the support team to resolve product-related issues; good troubleshooting skills and solid knowledge of support work are required.
• Independently apply problem-solving skills to identify symptoms and root causes of issues.
• Make effective and efficient decisions even when data is ambiguous.
• Provide technical guidance, support and mentoring to more junior team members.
• Make active contributions to improvement decisions and make technology recommendations that balance business needs and technical requirements.
• Proactively understand stakeholder needs, goals, expectations and viewpoints to deliver results.
• Ensure design thinking accounts for the long-term maintainability of code.
• Thrive in a highly collaborative company environment where agility is paramount.
• Stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc.
• Perform system maintenance, production incident problem management, identification of root cause, and issue remediation.
All About You
• Bachelor's degree in Information Technology, Computer Science or Engineering, or equivalent work experience, with a proven track record of successfully delivering on complex technical assignments.
• A solid foundation in Computer Science fundamentals, web applications and microservices-based software architecture.
• Full-stack development experience, including databases (Oracle, Netezza, SQL Server) and hands-on experience with Hadoop, Python, Impala, etc.
• Excellent SQL skills, with experience working with large and complex data sources and the ability to comprehend and write complex queries.
• Experience working in Agile teams; conversant with Agile/SAFe tenets and ceremonies.
• Strong analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies, and systems.
• Excellent English communication skills (both written and verbal) to effectively interact with multiple technical teams and other stakeholders.
• High-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment along with a high degree of initiative and self-motivation to drive results.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
• Abide by Mastercard’s security policies and practices;
• Ensure the confidentiality and integrity of the information being accessed;
• Report any suspected information security violation or breach; and
• Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-240980

Posted 1 week ago

Apply

10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Build the future of the AI Data Cloud. Join the Snowflake team. We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers. This person must be a hands-on, self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem and Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. The person we’re looking for shares our passion about reinventing the data platform and thrives in the dynamic environment. That means having the flexibility and willingness to jump in and get done what needs to be done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving technologies for data and analytics in order to be an authoritative resource for Snowflake, System Integrators and customers. And it means working collaboratively with a broad range of people both inside and outside the company. 
AS A SOLUTIONS ARCHITECT AT SNOWFLAKE, YOU WILL:
• Be a technical expert on all aspects of Snowflake
• Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process
• Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
• Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
• Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
• Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
• Provide guidance on how to resolve customer-specific technical challenges
• Support other members of the Professional Services team in developing their expertise
• Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
• Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role
• Experience migrating from one data platform to another and holistically addressing the unique challenges of migrating to a new platform
• University degree in computer science, engineering, mathematics or related fields, or equivalent experience
• Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
• Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools
• Strong skills in databases, data warehouses, and data processing
• Extensive hands-on expertise with SQL and SQL analytics
• Experience and track record of success selling data and/or analytics software to enterprise customers; includes proven skills identifying key stakeholders, winning value propositions, and compelling events
• Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum, etc.)
• Software development experience with C/C++ or Java. Scripting experience with Python, Ruby, Perl, Bash.
• Ability and flexibility to travel to work with customers on-site
BONUS POINTS FOR THE FOLLOWING:
• Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase)
• Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, Business Objects, Tableau)
• Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, OpenStack, etc.)
• Experience implementing ETL pipelines using custom and packaged tools
• Experience using AWS services such as S3, Kinesis, Elastic MapReduce, Data Pipeline
• Experience selling enterprise SaaS software
• Proven success at enterprise software
WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE?
Unique opportunity to work on a truly disruptive software product Get unique, hands-on experience with bleeding edge data warehouse technology Develop, lead and execute an industry-changing initiative Learn from the best! Join a dedicated, experienced team of professionals. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
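One concrete step in the platform migrations described above is bulk-loading files staged from a legacy warehouse into Snowflake with COPY INTO. The sketch below only assembles such a statement; the table, stage, and file-format names are hypothetical:

```python
# Sketch of one migration step: generate a Snowflake COPY INTO statement
# that loads files from a named stage into a target table. All object
# names here are invented for illustration.
def copy_into_statement(table: str, stage: str, file_format: str) -> str:
    """Build a COPY INTO statement for files sitting in a named stage."""
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage}\n"
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"ON_ERROR = 'ABORT_STATEMENT';"
    )

sql = copy_into_statement("analytics.public.orders",
                          "legacy_unload_stage", "csv_gzip")
print(sql.splitlines()[0])  # → COPY INTO analytics.public.orders
```

In practice the statement would be executed through the Snowflake connector after unloading the legacy tables (e.g. from Netezza or Teradata) to cloud storage; generating the SQL separately makes the migration scriptable and reviewable.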

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.
• 3+ years of direct experience working in IT infrastructure.
• 2+ years in a customer-facing role working with enterprise clients.
• Experience with implementing and/or maintaining technical solutions in virtualized environments.
• Experience in design, architecture and implementation of data warehouses, data pipelines and flows.
• Experience with developing software code in one or more languages such as Java and Python. Proficient with SQL.
• Experience designing and deploying large-scale distributed data processing systems with one or more technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
• Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
• Demonstrated excellent communication, presentation, and problem-solving skills.
Mandatory Certifications Required: Google Cloud Professional Data Engineer
Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python
Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python
Years of experience required: 8-10
Qualifications: B.E / B.TECH / MBA / MCA
Required Skills: Python (Programming Language)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description: The AWS Fintech team is looking for a Data Engineering Manager to transform and optimize high-scale, world-class financial systems that power the global AWS business. The success of these systems will fundamentally impact the profitability and financial reporting for AWS and Amazon. This position will play an integral role in leading programs that impact multiple AWS cost optimization initiatives. These programs will involve multiple development teams across diverse organizations to build sophisticated, highly reliable financial systems. These systems enable routine finance operations as well as machine learning, analytics, and GenAI reporting that enable AWS Finance to optimize profitability and free cash flow. This position requires a proactive, highly organized individual with an aptitude for data-driven decision making, a deep curiosity for learning new systems, and collaborative skills to work with both technical and financial teams.
Key job responsibilities
• Build and lead a team of data engineers, application development engineers, and systems development engineers.
• Drive execution of data engineering programs and projects.
• Help our leadership team make challenging decisions by presenting well-reasoned and data-driven solution proposals and prioritizing recommendations.
• Identify and execute on opportunities for our organization to move faster in delivering innovations to our customers.
• This role has on-call responsibilities.
A day in the life: The successful candidate will build and grow a high-performing data engineering team to transform financial processes at Amazon. The candidate will be curious and interested in the capabilities of Large Language Model-based development tools like Amazon Q to help teams accelerate transformation of systems. The successful candidate will begin with execution to familiarize themselves with the space and then construct a strategic roadmap for the team to innovate.
You thrive and succeed in an entrepreneurial environment, and are not hindered by ambiguity or competing priorities. You thrive driving strategic initiatives and also dig in deep to get the job done. About The Team The AWS FinTech team enables the growth of earth’s largest cloud provider by building world-class finance technology solutions for effective decision making. We build scalable long-term solutions that provide transparency into financial business insights while ensuring the highest standards of data quality, consistency, and security. We encourage a culture of experimentation and invest in big ideas and emerging technologies. We are a globally distributed team with software development engineers, data engineers, application developers, technical program managers, and product managers. We invest in providing a safe and welcoming environment where inclusion, acceptance, and individual values are honored. Basic Qualifications Experience managing a data or BI team 2+ years of processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or Hadoop based big data solution) experience 2+ years of relational database technology (such as Redshift, Oracle, MySQL or MS SQL) experience 2+ years of developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes) experience 5+ years of data engineering experience Experience communicating to senior management and customers verbally and in writing Experience leading and influencing the data or BI strategy of your team or organization Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS Preferred Qualifications Knowledge of software development life cycle or agile development environment with emphasis on BI practices Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with AWS Tools and Technologies (Redshift, S3, EC2) Our inclusive culture empowers 
Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2961772
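The "large-scale data structures for business intelligence analytics (using ETL/ELT processes)" qualification above can be reduced to a toy extract-transform-load example. The records and schema are invented, with SQLite standing in for a warehouse such as Redshift:

```python
import sqlite3

# Minimal ETL sketch: extract raw records, transform (normalize currency
# amounts to integer cents), then load into a queryable table.
raw = [("ord-1", "10.50"), ("ord-2", "3.99")]                # extract

transformed = [(oid, int(round(float(amt) * 100)))           # transform
               for oid, amt in raw]

conn = sqlite3.connect(":memory:")                           # load
conn.execute("CREATE TABLE orders (order_id TEXT, amount_cents INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", transformed)

total = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
print(total)  # → 1449
```

In an ELT variant the raw strings would be loaded first and the cents conversion expressed in SQL inside the warehouse; the extract/transform/load split itself is what the qualification is naming.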

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.
Data Engineer - Google Cloud
• 5+ years of direct experience working with Enterprise Data Warehouse technologies.
• 5+ years in a customer-facing role working with enterprise clients.
• Experience with implementing and/or maintaining technical solutions in virtualized environments.
• Experience in design, architecture and implementation of data warehouses, data pipelines and flows.
• Experience with developing software code in one or more languages such as Java, Python and SQL.
• Experience designing and deploying large-scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
• Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
• Demonstrated excellent communication, presentation, and problem-solving skills.
Mandatory Certifications Required: Google Cloud Professional Cloud Architect, or Google Cloud Professional Data Engineer + AWS Big Data Specialty Certification
Mandatory skill sets: GCP Data Engineering, SQL, Python
Preferred skill sets: GCP Data Engineering, SQL, Python
Years of experience required: 4-8
Qualifications: B.E / B.TECH / MBA / MCA
Required Skills: Python (Programming Language)

Posted 1 week ago

Apply

3.0 years

5 - 8 Lacs

Hyderābād

On-site

We are seeking an experienced and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable data solutions to support critical business needs. You will work with distributed data platforms, cloud infrastructure, and modern data engineering tools to enable efficient data processing, storage, and analytics. The role includes participation in an on-call rotation to ensure the reliability and availability of our systems and pipelines.
Key Responsibilities
• Data Platform Development: Design, develop, and maintain data pipelines and workflows on distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, or Teradata.
• Cloud Integration: Build and optimize cloud-based solutions using AWS or GCP to process and store large-scale datasets.
• Workflow Orchestration: Design and manage workflows and data pipelines using Apache Airflow to ensure scalability, reliability, and maintainability.
• Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes for efficient scalability and resource management.
• Event Streaming: Work with Kafka to implement reliable and scalable event streaming systems for real-time data processing.
• Programming and Automation: Write clean, efficient, and maintainable code in Python and SQL to automate data processing, transformation, and analytics tasks.
• Database Management: Design and optimize relational and non-relational databases to support high-performance querying and analytics.
• System Monitoring & Troubleshooting: Participate in the on-call rotation to monitor systems, address incidents, and ensure the reliability of production environments.
• Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and product managers, to understand data requirements and deliver solutions that meet business objectives. Participate in code reviews, technical discussions, and team collaboration to deliver high-quality software solutions.
This role includes participation in an on-call rotation to ensure the reliability and performance of production systems:
• Rotation Schedule: Weekly rotation beginning Tuesday at 9:00 PM PST through Monday at 9:00 AM PST.
• Responsibilities During On-Call: Monitor system health and respond to alerts promptly. Troubleshoot and resolve incidents to minimize downtime. Escalate issues as needed and document resolutions for future reference.
Requirements:
• Primary Technologies: BigQuery or another distributed data platform, for example Big Data (Hadoop/EMR/DataProc), Snowflake, Teradata, or Netezza; AWS, GCP, Kubernetes, Kafka, Python, SQL.
• Bachelor’s degree in computer science, Engineering, or a related field (or equivalent work experience).
• 3+ years of experience in data engineering or related roles.
• Hands-on experience with distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, Snowflake, or Teradata.
• Proficiency in Apache Airflow for building and orchestrating workflows and data pipelines.
• Proficiency in Python and SQL for data processing and analysis.
• Experience with cloud platforms like AWS or GCP, including building scalable solutions.
• Familiarity with Kubernetes for container orchestration.
• Knowledge of Kafka for event streaming and real-time data pipelines.
• Strong problem-solving skills and ability to troubleshoot complex systems.
• Excellent communication and collaboration skills to work effectively in a team environment.
Preferred
• Familiarity with CI/CD pipelines for automated deployments.
• Knowledge of data governance, security, and compliance best practices.
• Experience with DevOps practices and tools.
We have a global team of amazing individuals working on highly innovative enterprise projects and products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintech, and multiple Silicon Valley startups. What makes Confiz stand out is our focus on processes and culture.
Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM) and ISO 14001:2015 (EMS) certified. We have a vibrant culture of learning via collaboration and making the workplace fun. People who work with us use cutting-edge technologies while contributing to the company's success as well as their own. To know more about Confiz Limited, visit https://www.linkedin.com/company/confiz/
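The Airflow workflow-orchestration responsibility described above boils down to executing tasks in dependency (DAG) order. Below is a pure-Python sketch of that idea with invented task names; it uses the standard-library `graphlib`, not Airflow's API, to show the underlying concept:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, mirroring how an
# Airflow DAG wires operators together. Task names are hypothetical.
tasks = {
    "load_warehouse": {"transform"},
    "transform": {"extract_api", "extract_files"},
    "extract_api": set(),
    "extract_files": set(),
}

# static_order() yields a valid execution order: every task appears
# only after all of its dependencies.
order = list(TopologicalSorter(tasks).static_order())
print(order[-1])  # → load_warehouse
```

An orchestrator adds scheduling, retries, and parallelism on top of this ordering, but a topological sort of the dependency graph is the core mechanism.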

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderābād

On-site

- 2+ years of processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or Hadoop based big data solution) experience
- 2+ years of relational database technology (such as Redshift, Oracle, MySQL or MS SQL) experience
- 2+ years of developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes) experience
- 5+ years of data engineering experience
- Experience managing a data or BI team
- Experience communicating to senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
Do you pioneer? Do you enjoy solving complex problems in building and analyzing large datasets? Do you enjoy focusing first on your customer and working backwards? The Amazon transportation controllership team is looking for an experienced Data Engineering Manager with experience in architecting large/complex data systems and a strong record of achieving results, scoping and delivering large projects end-to-end. You will be the key driver in building out our vision for scalable data systems to support the ever-growing Amazon global transportation network businesses. Key job responsibilities As a Data Engineering Manager in Transportation Controllership, you will be at the forefront of managing large projects, providing vision to the team, and designing and planning large financial data systems that will allow our businesses to scale world-wide. You should have deep expertise in the database design, management, and business use of extremely large datasets, including using AWS technologies such as Redshift, S3, EC2, Data Pipeline and other big data technologies. Above all, you should be passionate about warehousing large datasets to answer business questions and drive change.
You should have excellent business acumen and communication skills to be able to work with multiple business teams, and be comfortable communicating with senior leadership. Due to the breadth of the areas of business, you will coordinate across many internal and external teams, and provide visibility to the senior leaders of the company with your strong written and oral communication skills. We need individuals with a demonstrated ability to learn quickly, think big, execute both strategically and tactically, and motivate and mentor their team to deliver business value to our customers on time. A day in the life On a daily basis you will: • manage and help grow a team of high-performing engineers • understand new business requirements and architect data engineering solutions for the same • plan your team's priorities, working with relevant internal/external stakeholders, including sprint planning • resolve impediments faced by the team • update leadership as needed • use judgement in making the right tactical and strategic decisions for the team and organization • monitor the health of the databases and ingestion pipelines Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with AWS Tools and Technologies (Redshift, S3, EC2) Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
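The daily duties above include monitoring the health of databases and ingestion pipelines. As a minimal sketch only (the table names, timestamps, and six-hour threshold are illustrative assumptions, not from the posting), a pipeline freshness check of this kind might look like:

```python
from datetime import datetime, timedelta

def stale_tables(last_loaded: dict, now: datetime, max_age: timedelta) -> list:
    """Return table names whose most recent load is older than max_age."""
    return sorted(t for t, ts in last_loaded.items() if now - ts > max_age)

# Hypothetical load timestamps, e.g. pulled from a Redshift audit table.
now = datetime(2024, 1, 2, 12, 0)
loads = {
    "shipments": datetime(2024, 1, 2, 11, 30),
    "invoices": datetime(2024, 1, 1, 9, 0),
}
print(stale_tables(loads, now, timedelta(hours=6)))  # → ['invoices']
```

In practice the timestamps would come from an ingestion audit table and the stale list would feed an alerting system rather than a print.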

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you’re passionate about developing your career, while helping others along the way, come join the Broadridge team. 8+ years designing, developing, and administering IBM Cognos 11.1.x applications. Cognos 11.1.x upgrade experience required. Installing hot fixes / service packs to the existing version of Cognos Analytics. Experience with Motio CI integration with Cognos Analytics. Knowledge of Cognos SDK and Cognos Lifecycle Manager is a plus. Hands-on experience with granular Cognos security customization and installing third-party tools. Hands-on experience with Cognos Framework Manager installation and configuration. Experience publishing packages and customizing package access as per requirements. Knowledge of Cognos TM1 is a plus. Experience with Cognos/Tableau installation and configuration in AWS. Responsible for troubleshooting and resolving Cognos Analytics/Tableau issues, opening service requests with the Cognos vendor, and working with different teams providing recommendations, driving standards, and planning and executing effective transition of development and production operations. Deployment of Cognos in a clustered environment and performing upgrades. Implement and document best practices for a Cognos environment. Experience in Windows/Linux based operating system environments and well versed in Linux OS commands. Experience should include maintenance and support activities, performance monitoring and tuning, upgrading versions, software configuration, business continuity and disaster recovery planning, and general IT processes such as Change Management, Configuration Management, Problem Resolution, and Incident Tracking. Ability to cross-train the team. Implementation of proactive Cognos environment health checks.
Hands-on Cognos user groups, security, and user entitlement administration. Experience with Cognos user LDAP/Active Directory integration/synchronization preferred. Experience with IIS 7.5 or higher is a plus. Integrating Cognos with a SharePoint portal/team is a plus. Ability to provide 24x7 production support for Cognos on an on-call rotation, with excellent communication skills, is required. Any other BI tool experience such as Tableau/Jaspersoft/Crystal is a plus. Experience with industry BI/reporting toolsets including Tableau, Jaspersoft, Cognos, Power BI, and Crystal. Tableau 2022.1.x upgrade experience required. Knowledge of Jasper Report Server upgrades from version 6.2 to 8.1 is a plus. Experience connecting to Hadoop, Oracle, Sybase, DB2, Netezza, Teradata, and SQL databases. Knowledge of data science integration and application (Python, R). Knowledge of programming languages (SDK, APIs, Java, JavaScript). Customizing the look and feel of Cognos and Tableau URLs is a plus. Excellent communication skills (must be able to interface with both technical and business leaders in the organization). Oversee and perform all system administration and change management responsibilities on the Tableau server, including server maintenance, patching, and hardware/software upgrades. Experience migrating Tableau workbooks/data sources into higher environments. Expertise installing/configuring Jasper Report Server in on-premises and cloud environments. Experience deploying Jasper report code from one environment to another. Experience installing/configuring Apache Tomcat and knowledge of customizing the system.xml and web.xml files.

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Gurugram, Haryana, India

On-site


TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there’s nothing that can stop us from growing together. What we are looking for Role: SQL DBA Experience Range: 6 - 12 years Location: New Delhi / Gurugram Interview Mode: Saturday Virtual Drive Must Have: 1. MSSQL Server 2. Azure SQL Server 3. Must have certifications in SQL Server / Azure SQL Good to Have: 1. DB2 2. Netezza 3. PowerShell 4. Azure PostgreSQL Essential: Administer and maintain database systems, with a focus on MS SQL Server along with Azure, PostgreSQL, and DB2. Supporting SQL Server in an Azure environment as IaaS/SQL MI/PaaS services. Managing Azure SQL databases, SQL Managed Instances, and Azure VMs in the Azure Portal. Monitor database performance and proactively address issues to ensure optimal functionality. Collaborate with project teams to understand database requirements and provide efficient solutions. Participate in the design, implementation, and maintenance of database structures for different applications. Work independently to troubleshoot and resolve database-related issues promptly. Implement best practices to enhance database performance and security. Manage databases on Azure Cloud, ensuring seamless integration and optimization for cloud-based solutions. Utilize SQL Server tools and other relevant technologies for effective database administration. Stay updated on the latest advancements in database tools and incorporate them into daily practices. Collaborate with cross-functional teams, including developers and system administrators, to achieve project goals, and provide guidance and support to team members on database-related issues. Minimum Qualification: • 15 years of full-time education • Minimum percentile of 50% in 10th, 12th, UG & PG (if applicable)
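One of the essentials above is monitoring database performance and proactively addressing issues. A minimal sketch of one such proactive check, flagging tables that have no supporting index, is below; it uses Python's bundled sqlite3 purely so the example is self-contained, whereas a real MS SQL Server / Azure SQL check would query the server's DMVs instead, and the table names are illustrative:

```python
import sqlite3

def tables_without_indexes(conn: sqlite3.Connection) -> list:
    """Flag user tables that have no index -- a common proactive DBA check."""
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    indexed = {r[0] for r in conn.execute(
        "SELECT tbl_name FROM sqlite_master WHERE type = 'index'")}
    return sorted(t for t in tables if t not in indexed)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, note TEXT)")
conn.execute("CREATE TABLE logs (msg TEXT)")
conn.execute("CREATE INDEX idx_orders_note ON orders (note)")
print(tables_without_indexes(conn))  # → ['logs']
```

A production version would also weigh table size and query patterns before recommending an index, since indexing everything has its own write-performance cost.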

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in Dashboard applications to our Markets Operations Technology Team. By joining Citi, you will become part of a global organization whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. Team/Role Overview Our Markets Operations Technology Team is a global team which provides exposure to countries such as India, the US, and the UK. The role of this team is to manage the end-to-end processing of a case within Dashboard Applications. What You’ll Do The Applications Development Senior Dashboard Programmer/Developer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. This role requires deep expertise in system design, hands-on coding, and strong problem-solving skills to create resilient, high-performing, and secure applications.
What We’ll Need From You Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgements Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality Ensure essential procedures are followed and help define operating standards and processes Acts as an SME to senior stakeholders and/or other team members. Drive the adoption of modern engineering ways of working, including Agile, DevOps, and CI/CD. Advocate for automated testing, infrastructure as code, and continuous monitoring to enhance software reliability. Apply Behavior-Driven Development (BDD), Test-Driven Development (TDD), and unit testing to ensure code quality and functionality. Conduct thorough code reviews, ensuring adherence to best practices in readability, performance, and security. Implement and enforce secure coding practices, performing vulnerability assessments and ensuring compliance with security standards. Responsibilities: The candidate should have 8+ years of overall experience, including 2+ years in the financial services industry (preferably in investment banks). The ideal candidate would be a self-sufficient individual contributor with a go-getter attitude, able to develop software meeting the laid-down quality metrics within the project environment.
The candidate should have prior working experience in a competitive, high-paced environment delivering software to meet business needs. Relevant technologies: QlikView, Qlik Sense, Tableau, NPrinting, JScript, HTML and Netezza. Technically, the dashboards are built on QlikView and Qlik Sense with Netezza at the backend. NPrinting is used to generate and send user reports as mail attachments. Experience with high-performance & high-volume integrated dashboard development and database performance tuning. Strong QlikView and Qlik Sense knowledge and experience using Mashups to build dashboards is a must. Knowledge of design methodologies. Display sound analytical, problem-solving, presentation and inter-personal skills to handle various critical situations. Ability to carry out adaptive changes necessitated by changes in business requirements and technology. Post-trade processing experience; familiarity with the trade life cycle and associated business processes. The role would be based in Pune to drive client interfacing with business and operations and to drive new product onboarding onto the current platform. The person would be responsible for understanding business requirements and interacting with upstream systems. The candidate is expected to deliver new products to be included and enhancements to the existing product for more coverage of the various regions, feeds and markets to be covered. Support and manage the existing code base. The candidate must have a strong desire to learn, commitment towards roles & responsibilities and zeal for hard work in order to be a perfect fit for the team. Education: Bachelor’s degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


🚀 We’re Hiring: Informatica Developer (6–9 Years Experience) | Chennai (Work From Office) 🚀 Are you ready to take your ETL expertise to the next level? Join our client, a leading IT solutions provider, as we look for Informatica professionals with 6–9 years of hands-on experience for an immediate opportunity in Chennai. 🔍 Role Overview As a key member of our data integration team, you'll design, develop, test, and support robust ETL solutions using Informatica and related tools. You'll play a critical role in driving data strategy for top-tier financial services clients. 📍 Location: Chennai (Work From Office) 🕒 Notice Period: Immediate joiners or candidates with up to 30 days notice only ✅ Important: Background verification, PF account, and no dual employment are mandatory 🛠️ Interview Process: Multiple technical rounds by our client ✅ What We’re Looking For: Must-Have Skills: 6–9 years in ETL/Informatica (Axway or similar tools also considered) Strong SQL skills for data analysis and validation Proficiency in Oracle, Netezza, and data warehouse/lake environments Automation experience using Java frameworks; scripting in Python, Unix, Shell Experience with DevOps tools: Jenkins, UDeploy, Concourse, CI/CD pipelines Familiarity with cloud platforms like AWS, Azure, and Snowflake Strong understanding of SDLC/STLC, defect tracking, and Agile methodologies Excellent communication and coordination with business & tech stakeholders Good-to-Have Skills: Exposure to BI/reporting tools like Power BI, OBIEE, Tableau Experience with AtScale (semantic layer platform) Hands-on with data test automation tools like iCEDQ 👤 Who Should Apply? Professionals with a passion for data integration and testing Candidates who thrive in fast-paced, Agile environments Individuals ready to work from our Chennai office and join immediately Ready to make an impact with our client? Apply now or tag someone who fits this role!
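The must-have skills above include strong SQL for data analysis and validation. A common post-load ETL validation is reconciling row counts between a staging source and its target; a minimal sketch (table names are illustrative, and Python's bundled sqlite3 stands in for Oracle/Netezza so the example runs standalone) might look like:

```python
import sqlite3

def rowcount_diff(conn: sqlite3.Connection, source: str, target: str) -> int:
    """Row-count difference between a source staging table and its ETL
    target; 0 means the load reconciled. Identifiers are trusted here,
    since table names cannot be bound as SQL parameters."""
    (src,) = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()
    (tgt,) = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()
    return src - tgt

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (id INTEGER)")
conn.execute("CREATE TABLE dw_customers (id INTEGER)")
conn.executemany("INSERT INTO stg_customers VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO dw_customers VALUES (?)", [(1,), (2,)])
print(rowcount_diff(conn, "stg_customers", "dw_customers"))  # → 1
```

Real validation suites extend the same idea to checksums and column-level comparisons, which is what tools like iCEDQ automate.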
#HiringNow #InformaticaJobs #ETLDeveloper #ChennaiJobs #DataIntegration #SQL #CloudData #DevOps #ImmediateJoiners #6to9YearsExperience #strive4x #OGI

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Build the future of the AI Data Cloud. Join the Snowflake team. We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers. This person must be a hands-on self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem and Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get done what needs to be done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving technologies for data and analytics in order to be an authoritative resource for Snowflake, System Integrators and customers. And it means working collaboratively with a broad range of people both inside and outside the company.
AS A SOLUTIONS ARCHITECT AT SNOWFLAKE, YOU WILL: Be a technical expert on all aspects of Snowflake Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments Provide guidance on how to resolve customer-specific technical challenges Support other members of the Professional Services team in developing their expertise Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE: Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role Experience migrating from one data platform to another and holistically addressing the unique challenges of migrating to a new platform University degree in computer science, engineering, mathematics or related fields, or equivalent experience Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos Understanding of complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools Strong skills in databases, data warehouses, and data processing Extensive hands-on expertise with SQL and SQL analytics Experience and track record of success selling data and/or analytics software to enterprise customers; includes proven skills identifying key stakeholders, winning value propositions, and compelling events Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum, etc.) Software development experience with C/C++ or Java Scripting experience with Python, Ruby, Perl, Bash Ability and flexibility to travel to work with customers on-site BONUS POINTS FOR THE FOLLOWING: Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase) Familiarity and experience with common BI and data exploration tools (e.g. Microstrategy, Business Objects, Tableau) Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, OpenStack, etc.) Experience implementing ETL pipelines using custom and packaged tools Experience using AWS services such as S3, Kinesis, Elastic MapReduce, Data pipeline Experience selling enterprise SaaS software Proven success at enterprise software WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE? 
Unique opportunity to work on a truly disruptive software product Get unique, hands-on experience with bleeding edge data warehouse technology Develop, lead and execute an industry-changing initiative Learn from the best! Join a dedicated, experienced team of professionals. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
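Migrating from platforms such as Netezza, Exadata, or Teradata to Snowflake, as described in the role above, typically begins with mapping column types between source and target. As an illustration only — the tiny mapping below is an assumption for the sketch, not an official Netezza-to-Snowflake type matrix — a migration-planning helper could look like:

```python
# Hypothetical, deliberately incomplete type mapping for migration planning.
NETEZZA_TO_SNOWFLAKE = {
    "BYTEINT": "NUMBER(3,0)",
    "NVARCHAR": "VARCHAR",
    "ST_GEOMETRY": "GEOGRAPHY",
}

def map_type(netezza_type: str) -> str:
    """Map a Netezza column type to a Snowflake type, passing through
    names that exist on both platforms unchanged."""
    t = netezza_type.upper()
    return NETEZZA_TO_SNOWFLAKE.get(t, t)

print(map_type("byteint"))  # → NUMBER(3,0)
print(map_type("VARCHAR"))  # → VARCHAR
```

In a real engagement the mapping would be generated from the source catalog and reviewed per column, since precision and collation rarely translate one-for-one.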

Posted 2 weeks ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in Dashboard applications to our Markets Operations Technology Team. By joining Citi, you will become part of a global organization whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. Team/Role Overview Our Markets Operations Technology Team is a global team which provides exposure to countries such as India, the US, and the UK. The role of this team is to manage the end-to-end processing of a case within Dashboard Applications. What You’ll Do The Applications Development Senior Dashboard Programmer/Developer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. This role requires deep expertise in system design, hands-on coding, and strong problem-solving skills to create resilient, high-performing, and secure applications.
What We’ll Need From You Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgements Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality Ensure essential procedures are followed and help define operating standards and processes Acts as an SME to senior stakeholders and/or other team members. Drive the adoption of modern engineering ways of working, including Agile, DevOps, and CI/CD. Advocate for automated testing, infrastructure as code, and continuous monitoring to enhance software reliability. Apply Behavior-Driven Development (BDD), Test-Driven Development (TDD), and unit testing to ensure code quality and functionality. Conduct thorough code reviews, ensuring adherence to best practices in readability, performance, and security. Implement and enforce secure coding practices, performing vulnerability assessments and ensuring compliance with security standards. Responsibilities: The candidate should have 12+ years of overall experience with 8+ years of relevant experience in Tableau / Qlik Sense, including 2+ years in the financial services industry (preferably in investment banks). The ideal candidate would be a self-sufficient individual contributor with a go-getter attitude, able to develop software meeting the laid-down quality metrics within the project environment.
The candidate should have prior working experience in a competitive, high-paced environment delivering software to meet business needs. The candidate will be responsible for migrating existing dashboards from Qlik Sense to Tableau as part of a strategic initiative. Relevant technologies: QlikView, Qlik Sense, Tableau, JScript, HTML and Netezza. Technically, the dashboards are built on Qlik Sense with Netezza at the backend. Experience with high-performance & high-volume integrated dashboard development and database performance tuning. Strong Qlik Sense knowledge and experience using Mashups to build dashboards is a must. Experience migrating existing Qlik Sense dashboards to Tableau will be an added advantage. Knowledge of design methodologies. Display sound analytical, problem-solving, presentation and inter-personal skills to handle various critical situations. Ability to carry out adaptive changes necessitated by changes in business requirements and technology. Post-trade processing experience; familiarity with the trade life cycle and associated business processes. The role would be based in Pune to drive client interfacing with business and operations and to drive new product onboarding onto the current platform. The person would be responsible for understanding business requirements and interacting with upstream systems. The candidate is expected to deliver new products to be included and enhancements to the existing product for more coverage of the various regions, feeds and markets to be covered. Support and manage the existing code base. The candidate must have a strong desire to learn, commitment towards roles & responsibilities and zeal for hard work in order to be a perfect fit for the team. Education: Bachelor’s degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Citi Global Functions Technology is a world-class technology group employing an innovative, disciplined, and business-focused approach to developing a wide variety of products and solutions. Across many diverse technology hubs worldwide, our 14k+ technologists design, build and deploy technology solutions for business stakeholders across Risk, Finance, Compliance and HR domains. Citi is going through a major transformation program to improve its overall Financial Management by implementing an industry-standard cloud-based software platform. Looking for a Business Analyst who has strong experience working with technical teams on Ab Initio, ETL, and Data Warehousing in strong partnership with Finance stakeholders. Responsibilities: Leading business requirements for the buildout of engineering tools using Ab Initio, Metadata Hub, Tricentis Tosca, Unix and Oracle. This is an individual contributor role that owns all business requirements for data engineering utilities, enabling rollout and ensuring standard GFT processes are deployed for the successful go-live of the strategic ledger for a key project. Performing initiatives related to system business analysis, functional testing, SIT, the User Acceptance Testing (UAT) process and product rollout into production. You will be a BA specialist who works with technology project managers, UAT professionals and users to design and implement appropriate scripts/plans for an application testing strategy/approach. Responsibilities may also include defining business requirements for software quality assurance testing. Resolves complex and highly variable issues. Analyses trends at an organizational level to improve processes; follows and analyses industry trends. Documents design standards and procedures; ensures that they are adhered to throughout the software development life cycle. Manages organizational process change. Develops and implements methods for cost, effort and milestones of IT Quality activities.
Strives for continuous improvement and streamlining of processes. Ensures consistency and quality of processes across the organization. Exhibits in-depth understanding of concepts and procedures within own area and basic knowledge of these elements in other areas. Requires in-depth understanding of how own area integrates within IT Quality and has basic commercial awareness. Responsible for budgeting, project estimates, task management & balancing prioritization across multiple streams of development. Collaborates with local and global stakeholders such as the QA team, production support team, environment management team, DBA team, etc. to ensure project stability and productivity. Experience with Citi implementations and Oracle configuration. Performs other duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets. Qualifications: Relevant experience in software business analysis or IT covering Finance Technology in Financial Services. Relevant experience in leading business requirements for development of enterprise-scale platforms, products, or frameworks, preferably using Oracle/Netezza/Teradata. Strong experience in analyzing and communicating complex data problems with the tools available at Citi. Knowledge of any well-known software development and testing life-cycle methodology. Requires communication and diplomacy skills and an ability to persuade and influence. Adopting a standard process to ensure all test cases coming from key stakeholders are received, reviewed, and validated in a consistent fashion. Experience in all aspects of data, namely reconciliations, data comparison, data quality and data security for SaaS or cloud-based platforms.
Demonstrated experience in collaborating with different teams to ensure proactive cross-application/downstream impact analysis; responsible for creating test plans and strategy across multiple business-critical applications in Finance. Experience with existing Citi applications, implementations and configurations. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Business Analysis / Client Services ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities; building and maintaining effective working relationships with a range of customers; ensuring relevant standards are defined and maintained; and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation. The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation, and improvements. Will coach, mentor, and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation, and operational support. Will work closely with the Product Owner to understand requirements/user stories and be able to plan and estimate the time taken to deliver them. Will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be highly skilled and experienced in the use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL, and Athena.

Years of Experience: 8-12

Essential domain expertise:
Experience in Big Data technologies: AWS, Redshift, Glue, PySpark
Experience of MPP (Massively Parallel Processing) databases helpful, e.g. Teradata, Netezza
Understanding of the challenges involved in Big Data: large table sizes (depth/width), even distribution of data
Experience of programming: SQL, Python
Data modelling experience/awareness: Third Normal Form, Dimensional Modelling
Data pipelining skills: data blending, etc.
Visualisation experience: Tableau, Power BI, etc.
Data management experience: e.g. data quality, security
Experience of working in a cloud environment: AWS
Development/delivery methodologies: Agile, SDLC
Experience working in a geographically disparate team
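The "even distribution of data" point above is the core tuning concern in MPP databases such as Redshift, Teradata, or Netezza: rows are hashed on a distribution key across slices (or AMPs), so a low-cardinality or heavily repeated key piles work onto a few slices while the rest sit idle. A minimal sketch of the idea in plain Python (the slice count and column values are illustrative, not tied to any specific engine):

```python
from collections import Counter

def slice_counts(keys, num_slices=4):
    """Hash each distribution-key value to a slice and count rows per slice,
    mimicking how an MPP database distributes rows by a chosen key."""
    counts = Counter(hash(k) % num_slices for k in keys)
    return [counts.get(s, 0) for s in range(num_slices)]

# High-cardinality key (e.g. a surrogate order_id): rows spread evenly.
even = slice_counts(range(100_000))

# Low-cardinality key (e.g. a country column dominated by one value):
# nearly all rows land on whichever slice that value hashes to.
skewed = slice_counts(["IN"] * 99_000 + ["US"] * 1_000)

print(max(even) - min(even))   # near-zero spread: balanced work per slice
print(max(skewed))             # one slice carries ~99% of the rows
```

In a real system the fix is choosing a higher-cardinality distribution key (or round-robin/EVEN distribution), since query runtime is gated by the busiest slice.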

Posted 2 weeks ago


1.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Roles & Responsibilities

Job Description: Build pipelines to bring in a wide variety of data from multiple sources within the organization, as well as from social media and public data sources. Collaborate with cross-functional teams to source data and make it available for downstream consumption. Work with the team to provide an effective solution design to meet business needs. Ensure regular communication with key stakeholders; understand any key concerns about how the initiative is being delivered, or any risks/issues that have either not yet been identified or are not being progressed. Ensure dependencies and challenges (risks) are escalated and managed. Escalate critical issues to the Sponsor and/or Head of Data Engineering. Ensure timelines (milestones, decisions, and delivery) are managed and the value of the initiative is achieved, without compromising quality and within budget. Ensure an appropriate and coordinated communications plan is in place for initiative execution and delivery, both internal and external. Ensure final handover of the initiative to business-as-usual processes; carry out a post-implementation review (as necessary) to ensure initiative objectives have been delivered, and feed any lessons learned into future initiative management processes.

Who We Are Looking For

Competencies & Personal Traits:
Works as a team player
Excellent problem-analysis skills
Experience with at least one cloud infrastructure provider (Azure/AWS)
Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL)
Knowledge of Big Data ETL processing tools
Experience with Hive and Hadoop file formats (Avro/Parquet/ORC)
Basic knowledge of scripting (shell/bash)
Experience working with multiple data sources, including relational databases (SQL Server/Oracle/DB2/Netezza)
Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo, and Azure DevOps
Basic understanding of DevOps practices using Git version control
Ability to debug, fine-tune, and optimize large-scale data processing jobs

Working Experience: 1-3 years of broad experience working with enterprise IT applications in cloud platforms and big data environments.

Professional Qualifications: Certifications related to Data and Analytics would be an added advantage.

Education: Master's/Bachelor's degree in STEM (Science, Technology, Engineering, Mathematics).

Language: Fluency in written and spoken English.

Experience: 3-4.5 years

Skills
Primary Skill: Data Engineering
Sub Skill(s): Data Engineering
Additional Skill(s): Kafka, Big Data, Apache Hive, SQL Server DBA, CI/CD, Apache Spark

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
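The pipeline responsibilities described above (ingesting data from several sources, applying a quality gate, and publishing it for downstream consumption) follow the standard extract-transform-load shape. A minimal stdlib-only sketch, with hypothetical source names and fields; in a real job of this kind the extract and load steps would target Spark DataFrames, Hive tables, or a relational database rather than in-memory lists:

```python
import json

def extract(sources):
    """Pull raw records from each named source (stubbed here as JSON strings)."""
    for name, payload in sources.items():
        for record in json.loads(payload):
            record["source"] = name  # tag provenance for downstream consumers
            yield record

def transform(records):
    """Normalize field names and drop records missing a required key."""
    for r in records:
        if "id" not in r:
            continue  # basic data-quality gate: reject incomplete records
        yield {"id": r["id"], "value": r.get("value", 0), "source": r["source"]}

def load(records):
    """Materialize cleaned records (a real job would write Parquet or a table)."""
    return sorted(records, key=lambda r: r["id"])

# Hypothetical inputs: two sources, one record missing its required id.
sources = {
    "crm": '[{"id": 2, "value": 10}, {"value": 99}]',
    "web": '[{"id": 1}]',
}
result = load(transform(extract(sources)))
print(result)
```

Keeping the three stages as separate functions is what lets each be swapped out (a new source, a stricter quality gate, a different sink) without touching the rest of the pipeline.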

Posted 2 weeks ago

