0.0 - 4.0 years
0 Lacs
Hyderabad, Telangana
Remote
Customer Experience Engineer 2
Hyderabad, Telangana, India
Date posted: Jul 29, 2025
Job number: 1850854
Work site: Up to 50% work from home
Travel: 0-25%
Role type: Individual Contributor
Profession: Program Management
Discipline: Customer Experience Engineering
Employment type: Full-Time

Overview
We are the ACES Strategic team (Advanced Cloud Engineering & Supportability), a global engineering team in Azure CXP focused on strategic Azure customers. We are customer-obsessed problem-solvers. We orchestrate and drive deep engagements in areas like Incident Management, Problem Management, Support, Resiliency, and customer empowerment. We represent the customer and amplify the customer voice with Azure Engineering, connecting to the quality vision for Azure, and we innovate to scale our learning across our customer base. Diversity and inclusion are central to who we are, how we work, and what we enable our customers to achieve. We know that empowering our customers starts with empowering our team to show up authentically, work in ways that are best for them, and achieve their career goals.

Every minute of every day, customers stake their entire business and reputation on the Microsoft Cloud. The Azure Customer Experience (CXP) team believes that when we meet our high standards for quality and reliability, our customers win; if we falter, our customers fail their end-customers. Our vision is to turn Microsoft Cloud customers into fans. Are you customer-obsessed and passionate about solving complex technical problems? Do you take pride in enhancing the customer experience through innovation? If so, join us and surround yourself with people who are passionate about cloud computing and believe that extraordinary support is critical to customer success. As a customer-focused Advanced Cloud Engineer, you are the primary engineering contact accountable for your customer's support experience on Azure.
You will drive resolution of critical and complex problems, support key customer projects on Azure, and be the voice of the customer within Azure. In this role, you will partner with Customer Success Account Managers, Cloud Solution Architects, Technical Support Engineers, and Azure engineering on our mission to turn Azure customers into fans through a world-class, engineering-led support experience. This role is flexible: you can work up to 50% from home.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Qualifications
Required Qualifications:
- Bachelor's degree in Engineering, Computer Science, or a related field AND 6+ years of software industry experience related to technology, OR equivalent experience.
- 4 years of demonstrated IT experience supporting and troubleshooting enterprise-level, mission-critical applications, resolving complex issues/situations and driving technical resolution across cross-functional organizations.
- 2+ years of experience in an external customer/client-facing role.
- 2+ years of experience working on cloud computing technologies.
- Experience with being on-call.

Technical Skills:
- Cloud computing technologies, with demonstrated hands-on experience in one or more of the following:
  - Core IaaS: Compute, Storage, Networking, High Availability.
  - Data Platform and Big Data: SQL Server, Azure SQL DB, HDInsight/Hadoop, Machine Learning, Azure Stream Analytics, Azure Data Factory/Databricks.
  - Azure PaaS services: Redis Cache, Service Bus, Event Hub, Cloud Services, IoT suite, Mobile Apps, etc.
- Experience with monitoring technologies such as Azure Monitor, Log Analytics, Resource Graph, Azure Alerts, Network Watcher, Grafana, Ambari, Prometheus, Datadog, Confluent, etc.
- Experience deploying, configuring, and operating enterprise monitoring solutions.
- Experience in one or more automation languages (PowerShell, Python, C#, open source).

Communication skills: ability to empathize with customers and convey confidence; able to explain highly technical issues to varied audiences; able to prioritize and advocate customers' needs to the proper channels; takes ownership and works toward a resolution.
Customer obsession: passion for customers and focus on delivering the right customer experience.
Growth mindset: openness and ability to learn new skills and technologies in a fast-paced environment.

The ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Responsibilities
Technically Oriented
- With minimal oversight, track customer incidents, engage with strategic customers and partners to understand issues, contribute to troubleshooting through diagnostics, and communicate progress and next steps to customers, with a focus on reducing the time taken to mitigate critical incidents.
- Use engineering and support tools, customer telemetry, and/or direct customer input to detect and flag issues in the products or in customers' usage of the products. Help customers stay current with best practices by sharing content.
- Identify and leverage developmental opportunities across product areas and business processes (e.g., mentorships, shadowing, trainings) for professional growth and to develop the technical skills needed to resolve customer issues.
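To make the monitoring and automation skills above concrete, here is a minimal sketch of a threshold-based alert evaluator, the kind of rule that products like Azure Monitor or Prometheus apply to metric streams. This is an illustrative stand-in only; the function name, sample data, and rule shape are assumptions, not any specific Azure API.

```python
from statistics import mean

def evaluate_alert(samples, threshold, min_breaches=3):
    """Fire an alert when at least `min_breaches` samples exceed the
    threshold.  A simplified, illustrative stand-in for the rule
    engines behind monitoring tools; not a real Azure Monitor call."""
    breaches = [s for s in samples if s > threshold]
    return {
        "fired": len(breaches) >= min_breaches,
        "breach_count": len(breaches),
        "mean_value": round(mean(samples), 2),
    }

# Hypothetical CPU utilisation samples (percent) over a 7-minute window
cpu = [62, 95, 97, 91, 55, 96, 60]
alert = evaluate_alert(cpu, threshold=90)
```

Real alert rules add windowing, severity, and notification routing, but the core logic is this kind of predicate over recent samples.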
Customer Solution Lifecycle Management
- With minimal guidance, serve as a connecting point between the product team and customers throughout the engagement life cycle; engage with customers to understand their business and availability needs; and develop and offer proactive guidance on designing configurations and deploying solutions on Azure, with support from subject matter experts.
- Handle critical escalations on customer issues from the customer, support, or field teams; conduct impact analysis; help customers with answers to their technical questions; and serve as an escalation resource in areas of subject matter expertise.
- Conduct in-depth root cause analysis of issues, translate findings into opportunities for improvement, and track and drive them as repair items.

Relationship/Experience Management
- Act as the voice of customers and channel product feedback from strategic customers to product groups. Identify customer usage patterns and drive resolution of recurring issues with product groups. Close the feedback loop with customers on product features.
- With minimal guidance, partner with other teams (e.g., program managers, software engineers, product and customer service support teams) to prioritize, unblock, and resolve critical customer issues. Collaborate with stakeholders to support the delivery of solutions to strategic customers and to resolve customer issues.
- Embody our culture and values.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 week ago
2.0 - 10.0 years
0 Lacs
Karnataka
On-site
As an AI & ML Instructor at ADaSci, you will be responsible for delivering comprehensive corporate training sessions to experienced professionals. Your deep expertise in artificial intelligence (AI), machine learning (ML), cloud platforms (Azure, AWS, GCP), data engineering, and MLOps will enable you to design engaging and informative training sessions that empower professionals to excel in their field.

Your key responsibilities will include developing and delivering training sessions on AI, ML, and related topics; designing hands-on labs and practical exercises to enhance learning; customizing training materials to meet the specific needs of corporate clients; providing real-world examples and case studies to illustrate concepts; evaluating participants' progress; and staying updated on the latest trends in AI, ML, cloud computing, data engineering, and MLOps.

To excel in this role, you should hold a graduate degree with 10+ years of experience, a postgraduate degree with 5+ years of experience, or a PhD with 2+ years of experience. You should have proven experience as an instructor or trainer in AI, ML, data engineering, or related fields; deep knowledge of AI and ML algorithms; proficiency in cloud platforms; experience with MLOps; a strong understanding of data engineering principles; and familiarity with system design and architecture.

Preferred skills include hands-on experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn), knowledge of containerization technologies like Docker, experience with big data technologies (e.g., Hadoop, Spark), and previous experience in corporate training or educational settings. Strong communication, presentation, and problem-solving skills, along with the ability to adapt training content for diverse audiences, are essential for success in this role.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Delhi
On-site
As a Principal Data Engineer, you will lead our data engineering efforts by designing and building robust data pipelines, optimizing data architecture, and collaborating with cross-functional teams to drive data-driven decision-making.

Your main responsibilities will include designing, building, and maintaining scalable data pipelines for large datasets; collaborating with data scientists and analysts to understand data requirements and ensure data quality; implementing data modeling techniques; and maintaining data architecture best practices. You will also optimize data storage and retrieval processes to improve performance and efficiency, monitor data flow, troubleshoot data-related issues in a timely manner, and document data engineering processes while maintaining data dictionaries.

To be successful in this role, you should have 6-8 years of experience in data engineering or a related field. You should be proficient in programming languages such as Python, Java, or Scala; have strong experience with SQL and NoSQL databases (e.g., PostgreSQL, MongoDB); and have hands-on experience with big data technologies like Hadoop, Spark, or Kafka. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud; knowledge of data warehousing solutions and ETL processes; experience with data modeling and data architecture principles; strong problem-solving skills; and the ability to work in a fast-paced environment are also essential.

If you are passionate about data engineering and have the skills and qualifications above, we encourage you to apply for this exciting opportunity to lead our data engineering efforts and make a significant impact on our data-driven decision-making processes.
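The extract-transform-load pattern at the heart of this role can be sketched in a few lines. Everything below is illustrative: the function names, CSV schema, and in-memory "warehouse" are assumptions standing in for real sources and targets such as Kafka topics or PostgreSQL tables.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast types and drop rows with a missing amount."""
    out = []
    for r in rows:
        if not r["amount"]:          # basic data-quality filter
            continue
        out.append({"id": int(r["id"]), "amount": float(r["amount"])})
    return out

def load(rows, target):
    """Load: append validated rows to the target store."""
    target.extend(rows)
    return len(rows)

raw = "id,amount\n1,10.5\n2,\n3,7.25\n"
warehouse = []                        # stand-in for a real table
loaded = load(transform(extract(raw)), warehouse)
```

Production pipelines replace each stage with connectors and add orchestration, retries, and monitoring, but they keep this same staged shape.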
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As a Senior Data Engineer at our organization, you will play a crucial role in the Data Engineering team within the Enterprise Data & Analytics organization. Your primary responsibility will be to design, build, and maintain both batch and real-time data pipelines that serve the needs of our enterprise, analyst communities, and downstream systems. Collaboration with data architects is essential to ensure that data engineering solutions align with long-term architecture objectives.

You will maintain and optimize the data infrastructure to facilitate accurate data extraction, transformation, and loading from diverse data sources, and develop ETL processes to extract and manipulate data effectively. Ensuring data accuracy, integrity, privacy, security, and compliance will be a top priority: you will follow quality control procedures and adhere to SOX compliance standards. Monitoring data systems performance, implementing optimization strategies, improving operational practices and metrics, and mentoring junior engineers will also be part of your responsibilities.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Information Systems, or a related field, along with a minimum of 5 years of relevant experience in data engineering. Experience with cloud data warehouse solutions (such as Snowflake) and cloud platforms (e.g., AWS, Azure, GCP), as well as exposure to Salesforce or any CRM system, will be beneficial. Proficiency in advanced SQL, relational databases, database design, large data sets, distributed computing (Spark/Hive/Hadoop), object-oriented languages (Python, Java), scripting languages, data pipeline tools (Airflow), and agile methodology is required.
Your problem-solving, communication, and organizational skills; your ability to work independently and collaboratively; your self-starting attitude; and your quick learning and adaptability will be crucial for excelling in this role. By following best practices and standards and contributing to the maturity of data engineering practices, you will be instrumental in driving business transformation through data.
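The "advanced SQL" requirement above amounts to writing analytical queries like the one below, shown here against Python's built-in sqlite3 for self-containment. The table name and data are invented for illustration; the same query shape applies to Snowflake or any relational warehouse.

```python
import sqlite3

# In-memory database with a toy orders table (illustrative data)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES
    (1, 'acme', 100.0),
    (2, 'acme', 50.0),
    (3, 'globex', 75.0);
""")

# Aggregate revenue per customer, largest first -- the kind of
# grouped analytical query the role's SQL requirement implies.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer ORDER BY revenue DESC"
).fetchall()
```

On a real warehouse the same statement would run unchanged apart from connection setup, which is what makes SQL fluency so portable across platforms.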
Posted 1 week ago
175.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

Smart Monitoring is an industry-leading, award-winning risk monitoring/control testing platform owned and managed by Global Risk Compliance; it leverages technology, automation, and data science to detect, predict, and prevent risks. Its patent-pending approach uniquely combines advances in data science and technology (AI, machine learning, cloud computing) to transform risk management. The Smart Monitoring Center of Excellence is a group of experts that leverages the Smart Monitoring platform to build and manage Key Risk Indicators (KRIs) and Automated Control Tests (ACTs) that monitor risks and detect control failures across AXP, supporting Business Units and Staff Groups, Product Lines, and Processes. The Smart Monitoring Center of Excellence team supports the businesses with a mission to enable business growth and objectives while maintaining a strong control environment.

We are seeking a Data Scientist to join this exciting opportunity to grow the Smart Monitoring COE multi-fold. As a member of the SM COE, the incumbent will be responsible for identifying opportunities to apply new and innovative ways to monitor risks through KRIs/ACTs and executing appropriate strategies in partnership with Business, OE, Compliance, and other stakeholder teams. Key activities for the role will include:
- Lead the design and implementation of NLP and GenAI based solutions for real-time identification of Key Risk Indicators.
- Own the architecture and roadmap of the models and tools from ideation to productionization.
- Lead a team of data scientists, providing mentorship, performance coaching, and technical guidance to build domain depth and deliver excellence.
- Champion governance and interpretability of models from a validation point of view.
- Lead R&D efforts to leverage external data (social forums, etc.) to generate insights for operational/compliance risks.
- Provide rigorous analytics solutions to support critical business functions and support machine learning solution prototyping.
- Collaborate with model consumers, data engineers, and all related stakeholders to ensure precise implementation of solutions.

Qualifications:
- Masters/PhD in a quantitative field (Computer Science, Statistics, Mathematics, Operations Research, etc.) with hands-on experience leveraging sophisticated analytical and machine learning techniques. Strong preference for candidates with 5-6+ years of working experience driving business results.
- Demonstrated ability to frame business problems as machine learning problems, leveraging external thinking and tools (from academia and/or other industries) to engineer a solvable solution that delivers business insights and an optimal control policy.
- Creativity to go beyond the status quo to construct and deliver the best solution to the problem; ability and comfort with working independently and making key decisions on projects.
- Deep understanding of machine learning/statistical algorithms such as time series analysis and outlier detection, neural networks/deep learning, boosting, and reinforcement learning.
- Experience with data visualization a plus.
- Expertise in an analytical language (Python, R, or equivalent) and experience with databases (GCP, SQL, or equivalent).
- Prior experience working with big data tools and platforms (Hadoop, Spark, or equivalent).
- Experience building NLP and/or GenAI solutions is strongly preferred.
- Self-motivated, with the ability to operate independently and handle multiple workstreams and ad-hoc tasks simultaneously.
- Team player with strong relationship-building, management, and influencing skills.
- Strong verbal and written communication skills.

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
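Outlier detection, one of the algorithms the qualifications name, can be illustrated with the simplest variant: flagging points whose z-score exceeds a threshold. The function name, threshold, and sample series below are illustrative assumptions, not the platform's actual KRI logic.

```python
from statistics import mean, stdev

def zscore_outliers(series, z_threshold=2.0):
    """Flag points whose z-score (distance from the mean in standard
    deviations) exceeds the threshold.  A minimal sketch of the
    outlier-detection family of techniques; real KRIs would use more
    robust methods."""
    mu, sigma = mean(series), stdev(series)
    return [x for x in series if abs(x - mu) / sigma > z_threshold]

# Hypothetical daily transaction counts with one anomalous spike
daily_txn_counts = [101, 98, 103, 99, 100, 102, 97, 250]
outliers = zscore_outliers(daily_txn_counts)
```

A spike like 250 dominates both the mean and the standard deviation, which is why production risk indicators often prefer robust statistics (median, MAD) over the raw z-score shown here.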
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We've Got Your Back.

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong. Join Team Amex and let's lead the way together.

American Express has embarked on an exciting transformation, driven by an energetic new team and an inclusive pool of candidates, to give everyone an equal opportunity for growth. Service Operations is responsible for providing reliable platforms for hundreds of critical applications and utilities within American Express. Its primary focus is to provide the technical expertise and tooling needed to ensure the highest level of reliability and availability for critical applications; to provide consultation and strategic recommendations by quickly assessing and remediating complex availability issues; and to drive automation and efficiencies that increase the quality, availability, and auto-healing of complex processes.

Responsibilities include, but are not limited to:
The ideal candidate will be responsible for designing, developing, and maintaining data pipelines.
- Serve as a core member of an agile team that drives user story analysis and elaboration, and designs and develops responsive web applications using the best engineering practices.
- Work closely with data scientists, analysts, and other partners to ensure the flawless flow of data.
- Build and optimize reports for analytical and business purposes.
- Monitor and resolve data pipeline issues to ensure smooth operation.
- Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of data.
- Implement data governance policies, access controls, and security measures to protect critical data and ensure compliance.
- Develop a deep understanding of integrations with other systems and platforms within the supported domains.
- Bring a culture of innovation, ideas, and continuous improvement; challenge the status quo, demonstrate risk taking, and implement creative ideas.
- Manage your own time, and work well both independently and as part of a team.
- Adopt emerging standards while promoting best practices and consistent framework usage.
- Work with Product Owners to define requirements for new features and plan increments of work.

Minimum Qualifications
- BS or MS degree in computer science, computer engineering, or another technical subject area, or equivalent.
- 0 to 3 years of work experience.
- At least 1 to 3 years of hands-on experience with SQL, including schema design, query optimization, and performance tuning.
- Experience with distributed computing frameworks like Hadoop, Hive, and Spark for processing large-scale data sets.
- Proficiency in a programming language such as Python or PySpark for building data pipelines and automation scripts.
- Understanding of cloud computing, and exposure to BigQuery and Airflow for executing DAGs.
- Knowledge of CI/CD, Git commands, and deployment processes.
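The data quality checks and validation this posting describes can be sketched as a small gate that enforces completeness (no missing required fields) and consistency (no duplicate keys) over a batch. The function name, field names, and sample batch are illustrative assumptions.

```python
def run_quality_checks(rows, required_fields=("id", "amount")):
    """Minimal data-quality gate: report (row_index, issue) pairs for
    missing required fields and duplicate ids.  Illustrative sketch
    of the checks a pipeline would run before loading data."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        if row.get("id") in seen_ids:
            issues.append((i, "duplicate id"))
        seen_ids.add(row.get("id"))
    return issues

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # fails completeness
    {"id": 1, "amount": 5.0},    # fails consistency
]
problems = run_quality_checks(batch)
```

In a real pipeline the same checks would run as validation tasks (e.g., inside an Airflow DAG) and fail the load, quarantine bad rows, or raise an alert instead of merely returning a list.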
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize data processing workflows.
- Excellent communication and collaboration skills.

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a candidate with over 5 years of experience, you will be responsible for big data testing involving technologies such as Hadoop, HDFS, Hive, Kafka, Spark, SQL, and UNIX. Proficiency in these tools is mandatory, and your expertise will be crucial to ensuring the efficiency and accuracy of our data testing processes.

While working at our Bangalore location, you will be required to undergo a background check, either before or after onboarding, facilitated by a designated BGV agency to ensure compliance and security within the organization.

Overall, your role will be pivotal in maintaining the quality and reliability of our data testing procedures, making your expertise in big data testing technologies essential to our team's success.
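A staple check in the kind of big data testing this role describes is source-to-target reconciliation: confirming that the row count (or a checksum) of a target table matches its source after an ETL run. The sketch below uses Python's built-in sqlite3 for self-containment; the table names and data are invented, and the same idea applies to Hive or Spark tables via their own SQL interfaces.

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Source-to-target row-count reconciliation -- a basic ETL test.
    Illustrative only; on Hive/Spark the same COUNT(*) comparison
    would run through that engine's SQL interface."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

# Toy setup: the target is missing one row, so the check should fail
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging (id INTEGER);
INSERT INTO staging VALUES (1), (2), (3);
CREATE TABLE curated (id INTEGER);
INSERT INTO curated VALUES (1), (2);
""")
result = reconcile_counts(conn, "staging", "curated")
```

Real test suites extend this with column-level checksums and sampled row comparisons, since matching counts alone do not prove the data itself arrived intact.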
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
Genpact is a global professional services and solutions firm with a team of over 125,000 professionals in more than 30 countries. Driven by curiosity, agility, and the desire to create lasting value for clients, we serve leading enterprises worldwide, including the Fortune Global 500. Our purpose is the relentless pursuit of a world that works better for people, and we achieve this through our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant, Research Data Scientist. We are looking for candidates with relevant experience in Text Mining/Natural Language Processing (NLP) tools, data sciences, big data, and algorithms. The ideal candidate should have full-cycle experience in at least one large-scale Text Mining/NLP project, including creating a business use case, Text Analytics assessment/roadmap, technology and analytic solutioning, implementation, and change management. Experience in Hadoop, including development in the MapReduce framework, is also desirable. The Text Mining Scientist (TMS) will play a crucial role in bridging enterprise database teams and business/functional resources, translating business needs into techno-analytic problems, and working with database teams to deliver large-scale text analytic solutions.
Responsibilities:
- Develop transformative AI/ML solutions to address clients' business requirements
- Manage project delivery involving data pre-processing, model training and evaluation, and parameter tuning
- Manage stakeholder/customer expectations and project documentation
- Research cutting-edge developments in AI/ML with NLP/NLU applications in various industries
- Design and develop solution algorithms within tight timelines
- Interact with clients to collect and synthesize requirements for an effective analytics/text mining roadmap
- Work with digital development teams to integrate algorithms into production applications
- Conduct applied research on text analytics and machine learning projects, file patents, and publish papers

Qualifications:
Minimum Qualifications/Skills:
- MS in Computer Science, Information Systems, or Computer Engineering
- Relevant experience in Text Mining/Natural Language Processing (NLP) tools, data sciences, big data, and algorithms

Technology:
- Open-source text mining paradigms (NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene) and cloud-based NLU tools (DialogFlow, MS LUIS)
- Statistical toolkits (R, Weka, S-Plus, Matlab, SAS Text Miner)
- Strong core Java experience, programming in the Hadoop ecosystem, and distributed computing concepts
- Proficiency in Python/R programming; Java programming skills are a plus

Methodology:
- Solutioning and consulting experience in verticals like BFSI and CPG, with text analytics experience on large structured and unstructured data
- Knowledge of AI methodologies (ML, DL, NLP, Neural Networks, Information Retrieval, NLG, NLU)
- Familiarity with Natural Language Processing and statistics concepts, especially in their application
- Ability to conduct client research to enhance the analytics agenda

Preferred Qualifications/Skills:
Technology:
- Expertise in NLP, NLU, and machine learning/deep learning methods
- UI development paradigms for text mining insights visualization
- Experience with Linux,
Windows, GPU, Spark, Scala, and deep learning frameworks

Methodology:
- Social network modeling paradigms, tools, and techniques
- Text analytics using NLP tools like Support Vector Machines and social network analysis
- Previous experience with text analytics implementations using open-source packages or SAS Text Miner
- Strong prioritization, consultative mindset, and time management skills

Job Details:
- Job Title: Principal Consultant
- Primary Location: India-Gurugram
- Schedule: Full-time
- Education Level: Master's/Equivalent
- Job Posting Date: Oct 4, 2024, 12:27:03 PM
- Unposting Date: Ongoing
- Master Skills List: Digital
- Job Category: Full Time
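Underlying many of the text mining toolkits this posting names is simple term-weighting arithmetic. As a hedged illustration, here is TF-IDF computed in pure Python: a term scores highly in a document when it is frequent there but rare across the corpus. The function name and toy corpus are assumptions for the sketch; libraries like NLTK or Lucene implement more refined variants.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute per-document TF-IDF weights with whitespace tokenisation.
    tf = count / doc_length; idf = ln(n_docs / docs_containing_term).
    A teaching sketch, not a production text-mining pipeline."""
    tokenised = [doc.lower().split() for doc in docs]
    n_docs = len(tokenised)
    df = Counter(term for doc in tokenised for term in set(doc))
    weights = []
    for doc in tokenised:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return weights

docs = ["risk monitoring report", "risk alert", "quarterly report"]
w = tf_idf(docs)
```

In the first document, "monitoring" (unique to it) outweighs "risk" (shared with another document), which is exactly the discrimination TF-IDF is designed to provide.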
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are seeking a skilled and seasoned Senior Data Engineer to join our innovative team. The ideal candidate will possess a solid foundation in data engineering and proficiency in Azure, Azure Data Factory (ADF), Azure Fabric, Databricks, and Snowflake. The position involves creating, developing, and maintaining data pipelines, ensuring data quality and accessibility, and collaborating with various teams to support our data-centric initiatives.

Your responsibilities will include designing, developing, and maintaining robust data pipelines utilizing Azure Data Factory, Azure Fabric, Databricks, and Snowflake. You will work closely with data scientists, analysts, and stakeholders to understand data requirements and guarantee the availability and quality of data. Implementing and refining ETL processes to handle the ingestion, transformation, and loading of data from diverse sources into data warehouses, data lakes, and Snowflake will also be a key aspect of your role. Additionally, you will uphold data integrity and security through the implementation of best practices and compliance with data governance policies.

Monitoring and resolving data pipeline issues to ensure the timely and accurate delivery of data, as well as enhancing data storage and retrieval processes to boost performance and scalability, will be essential tasks. It is crucial to stay abreast of industry trends and best practices in data engineering and cloud technologies. Furthermore, you will have the opportunity to mentor and guide junior data engineers, offering technical expertise and assistance as required.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 5 years of experience in data engineering, with a strong emphasis on Azure, ADF, Azure Fabric, Databricks, and Snowflake.
Proficiency in SQL, experience in data modeling and database design, and strong programming skills in Python, Scala, or Java are also essential. Familiarity with big data technologies like Apache Spark, Hadoop, and Kafka, as well as a solid grasp of data warehousing concepts and experience with data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake), is required. Knowledge of data governance, data quality, and data security best practices; excellent problem-solving abilities; and effective communication and collaboration skills within a team setting are all highly valued.

Preferred qualifications include experience with other Azure services such as Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB; familiarity with DevOps practices and tools for CI/CD in data engineering; and certifications in Azure Data Engineering, Snowflake, or related fields.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As a Senior Data Engineer at Buildnetic, a Singapore-headquartered company with its office in Bangalore, you will leverage your 8 to 12 years of experience to play a crucial role in designing, implementing, and managing the data infrastructure that drives data-driven decision-making. In this hybrid role, you will work with cutting-edge technologies to construct data pipelines, architect data models, and uphold data integrity.

Your key responsibilities will include:
- Designing, developing, and maintaining scalable data pipelines and architectures.
- Working with large datasets to create efficient ETL processes.
- Partnering with data scientists, analysts, and stakeholders to discern business requirements.
- Ensuring data quality through cleaning, validation, and profiling.
- Implementing data models for optimal performance in data warehousing and data lakes.
- Managing cloud data infrastructure on platforms such as AWS, Azure, or GCP.

You will work with a variety of programming languages, including Python, SQL, Java, and Scala, alongside data warehousing and data lake tools such as Snowflake, Redshift, Databricks, Hadoop, Hive, and Spark. Your expertise in data modeling techniques, ETL tools like Informatica and Talend, and management of both NoSQL and relational databases will be critical. Experience with CI/CD pipelines, Git for version control, troubleshooting complex data infrastructure issues, and proficiency in Linux/Unix systems will be advantageous.

If you possess strong problem-solving skills, effective communication abilities, and prior experience working in a hybrid environment, Buildnetic offers you the opportunity to be part of a forward-thinking company that prioritizes innovation and technological advancement. You will collaborate with a talented and collaborative team, enjoy a flexible hybrid working model, and receive a competitive salary and benefits package. If you are passionate about data engineering and eager to work with the latest technologies, we look forward to hearing from you.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Delhi
On-site
As a Senior Specialist in Robotic Process Automation (RPA) within the Finance Data and Automation team at Agoda, you will play a crucial role in delivering key automation initiatives for various Finance stakeholders. Reporting to the Manager (RPA), you will also have the opportunity to contribute to AI projects. You will collaborate closely with Finance users and Subject Matter Experts (SMEs) to understand and design high-quality automations, taking end-to-end ownership of initiatives. A successful candidate will be a trusted partner to the business, providing automation solutions by re-engineering processes. This role is based in Bangkok, with relocation provided for expat candidates.

In this role, you will work closely with stakeholders from different Finance verticals, consulting with them to propose the most suitable automation solutions. You will build, manage, and optimize end-to-end automations while meeting security and compliance requirements. You will also become familiar with the finance ecosystem at Agoda and work with Gen AI technology to deliver efficiency-saving automation projects.

To succeed in this role, you should have an undergraduate or postgraduate degree and at least 8 years of experience with RPA tools, process mining tools, and OCR; 10+ years is ideal but not mandatory. Power Automate, Celonis, and Rossum are mandatory tools, with Blue Prism, Alteryx, Blue Prism Process Intelligence (BPPI), and Interact desirable. You should have demonstrated end-to-end delivery of at least 5 processes using Power Automate Desktop, as well as full-stack development expertise with frameworks such as Django, React JS, Node JS, and Bootstrap. Proficiency in advanced programming and Gen AI implementation using JavaScript, Python, and vector embedding databases is essential. A high sense of ownership, a growth mindset, and the ability to be self-directed are key qualities for success. You should also establish robust RPA governance and maintain strong control over bots to ensure a high bot-utilization factor. Excellent communication skills and the ability to influence peers and build strong relationships within Finance and cross-functionally are important, as are advanced Excel skills.

It would be advantageous to have accounting/financial knowledge, commercial acumen, experience in full-stack development (UI/UX, API development, backend, Postgres DB), familiarity with scrum/agile methodology, and skills in Hadoop, Celonis Certification, and Power Automate Desktop Certification. Agoda is an Equal Opportunity Employer. Your application will be kept on file for future vacancies, and you can request to have your details removed at any time. Agoda does not accept third-party resumes, and unsolicited resumes will not incur any fees. For more information, please refer to our privacy policy.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Maharashtra
On-site
Genpact is a global professional services and solutions firm committed to delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are fueled by curiosity, agility, and a drive to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, and we serve leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are looking for a Senior Principal Consultant, Senior Data Architect to join our team!

Responsibilities:
- Manage programs and ensure the integration and implementation of program elements according to the agreed schedule, with quality deliverables.
- Lead and guide the development team on ETL architecture.
- Collaborate closely with customer architects and business and technical stakeholders to build trust and establish credibility.
- Provide insights on customer direction to guide them towards optimal outcomes.
- Address client technical issues, articulate understanding, and offer solutions in the AWS Cloud and Big Data domains.
- Build the infrastructure for efficient extraction, transformation, and loading of data from various sources using SQL and AWS "big data" technologies.
- Analyze the existing technology landscape and current application workloads.
- Design and architect solutions with scalability, operational completeness, and elasticity in mind.
- Apply hands-on experience in building Java applications.
- Optimize Spark applications running on Hadoop EMR clusters for performance.
- Develop architecture blueprints, detailed documentation, and bills of materials, including required cloud services.
- Collaborate with various teams to drive business growth and customer success.
- Lead strategic pre-sales engagements with larger, more complex customers.
- Engage and communicate proactively to align internal and external customer expectations.
- Drive key strategic opportunities with top customers in partnership with sales and delivery teams.
- Maintain customer relationships through proactive pre-sales engagements.
- Lead workshops to identify customer needs and challenges.
- Create and present services responses, proposals, and roadmaps to meet customer objectives.
- Lead pre-sales, solutioning, estimations, and POC preparation.
- Mentor team members and build reusable solution frameworks and components.
- Lead complex ETL requirements, design, and implementation.
- Ensure client satisfaction with the product by developing architectural requirements.
- Develop project plans, identify resource requirements, and assure code quality.
- Shape and enhance the ETL architecture, recommend improvements, and resolve design issues.

Minimum qualifications:
- Engineering degree or equivalent.
- Relevant work experience.
- Hands-on experience with ETL/BI tools such as Talend, SSIS, Ab Initio, and Informatica.
- Experience with cloud technologies such as AWS, Databricks, and Airflow.

Preferred skills:
- Excellent written and verbal communication skills.
- Strong analytical and problem-solving skills.
- Experience in consulting roles within a technology company.
- Ability to articulate technical solutions clearly to different stakeholders.
- Team player with the ability to collaborate effectively.
- Willingness to travel occasionally.

If you are looking to join a dynamic team and contribute to innovative solutions in data architecture, this role might be the perfect fit for you. Join us at Genpact and be part of shaping the future of professional services and solutions!

Job details:
- Job Title: Senior Principal Consultant, Senior Data Architect
- Primary Location: India-Mumbai
- Schedule: Full-time
- Education Level: Bachelor's / Graduation / Equivalent
- Job Posting Date: Oct 7, 2024, 7:51:53 AM
- Master Skills List: Digital
- Job Category: Full Time
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Punjab
On-site
Are you a skilled AI/ML engineer with 4-6 years of experience seeking a high-impact opportunity to lead the development of AI-driven solutions for a cutting-edge ERP platform? If so, we have the perfect role for you! We are looking for a Senior AI/ML Engineer to join our dynamic team in Mohali, Punjab. In this role, you will be responsible for architecting and implementing AI solutions that optimize ERP processes such as sales forecasting, inventory management, and customer behavior analysis. You will have the chance to work on exciting projects in predictive analytics, automation, and advanced data insights.

Key responsibilities:
- Architect and implement AI/ML models to optimize ERP processes.
- Develop and maintain scalable data pipelines for structured data flow.
- Collaborate with cross-functional teams to design and deploy AI solutions.
- Deploy AI models into production and monitor their performance.
- Stay up to date with the latest AI/ML technologies and integrate them into the ERP platform.
- Troubleshoot and fine-tune AI models for improved outcomes.
- Provide technical expertise and support for project execution.

Requirements:
- 4-6 years of AI/ML development experience.
- Proficiency in AI/ML frameworks such as TensorFlow, PyTorch, or ML.NET.
- Hands-on experience with C#/.NET, SQL Server, and building AI models for enterprise applications.
- Experience managing data pipelines using SQL, Spark, or Hadoop.
- Strong problem-solving and analytical skills.
- Excellent communication skills for effective collaboration.

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.

Join our fast-growing, innovative team and be part of shaping the future of ERP with advanced AI initiatives. Apply today with your updated resume and seize the opportunity to make a significant impact in the industry!
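To give a sense of the sales-forecasting work this posting mentions, here is a deliberately simple moving-average baseline. The figures and window size are invented for the example; a production ERP forecast would train a model with one of the frameworks the posting lists (TensorFlow, PyTorch, ML.NET) rather than use this sketch.

```python
# Illustrative sales-forecasting sketch: a simple moving-average baseline.
# The sales figures and window size are made up for this example.

def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` periods."""
    if len(history) < window:
        raise ValueError("need at least `window` observations")
    return sum(history[-window:]) / window

monthly_sales = [120, 130, 125, 140, 150, 145]
print(moving_average_forecast(monthly_sales))  # mean of 140, 150, 145
```

A baseline like this is often the first model deployed and monitored in production, because any learned model has to beat it to justify its complexity.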
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
The Intermediate Analyst in Specialized Analytics is a developing professional role with the ability to independently solve most problems and address complex issues. By combining specialized knowledge with industry standards, you will contribute to achieving sub-function objectives. Your role involves applying analytical thinking, utilizing data analysis tools, and maintaining attention to detail when making recommendations based on factual information. You will play a key role in interpreting data, breaking down information systematically, and communicating effectively. Your strong communication and diplomacy skills will be essential for exchanging complex information and collaborating closely with core business activities. The quality and timeliness of your service will directly impact the effectiveness of your team and related teams.

Responsibilities:
- Work with large and complex data sets to evaluate, recommend, and support business strategies.
- Identify and compile data sets using tools such as SQL and Access to predict, improve, and measure key business outcomes.
- Document data requirements, collection, processing, cleaning, and exploratory analysis.
- Specialize in the marketing, risk, digital, and AML fields.
- Evaluate risks in business decisions to safeguard Citigroup and ensure compliance with laws and regulations.

Qualifications:
- 2-5 years of relevant experience.
- Proficiency in data retrieval and manipulation.
- Strong analytic and problem-solving skills.
- Experience in a quantitative field.
- Willingness to learn with a can-do attitude.
- Excellent communication and interpersonal skills.
- Ability to build partnerships with cross-functional leaders.

Education:
- Bachelor's/University degree or equivalent experience.

Desired skills (good to have):
- Marketing analytics experience.
- Familiarity with digital marketing and/or digital experience domains.
- Experience with clickstream data and big data environments such as Hadoop.
- Predictive modeling using machine learning techniques.
- Customer journey analytics experience.
- Proficiency in Python, SQL, MS Excel, and PowerPoint.
- Exposure to journey analytics tools such as ClickFox, BryterCX, or Pointillist.
- Experience with Hive.

This job description offers an overview of the role's responsibilities; additional duties may be assigned as needed. Citigroup Inc. is an equal opportunity employer, providing career opportunities for qualified applicants. If you require accommodation due to a disability, please review the Accessibility at Citi guidelines.
Posted 1 week ago
10.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
We are looking for a highly experienced Java technical architect to join our data architecture team in Hyderabad. In this pivotal role, you will lead the design and implementation of the technical architecture for our data solutions. Your responsibilities will include driving the design and implementation of highly scalable, fault-tolerant data solutions, guaranteeing that all systems meet functional and non-functional requirements, integrating complex systems seamlessly to optimize data flow, and prioritizing the scalability and resiliency of technical implementations for long-term performance.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven track record (10-17 years) of extensive experience in enterprise-level software development with a strong focus on Java and its frameworks (Spring, Hibernate).
- Proficiency in cloud technologies (AWS, Azure) and containerization (Kubernetes, Docker).
- Experience working with distributed database technologies, including NoSQL, SQL, and Hadoop.
- Ability to design and implement highly scalable, fault-tolerant systems.
- Strong understanding of RESTful web services, microservices, message queues/streams, and API development.
- Excellent communication, leadership, and problem-solving skills, with the ability to build strong technical relationships across teams.
- Proven ability to work effectively within an Agile/Scrum development methodology.

Requirements:
- Mastery of Java 8 or higher.
- Spring Boot expertise.
- In-depth knowledge of Hibernate.
- Proficiency in SQL databases.
- Solid understanding of RESTful services.
- Experience with unit testing methodologies.
- Familiarity with Agile and Scrum development practices.
- Microservices architecture experience.
- A hands-on mentality, with a willingness to contribute to development efforts when necessary.

Good to have:
- Experience with AWS cloud technologies (e.g., EC2, S3, RDS, Lambda, CloudFormation).

If you are a Java technical architect with a passion for designing and implementing cutting-edge data solutions, we would love to have you on our team. Join us in Hyderabad and be a part of our fast-growing data & analytics company.
Posted 1 week ago
15.0 - 19.0 years
0 Lacs
Pune, Maharashtra
On-site
As the Director of Data Engineering, you will play a strategic leadership role in overseeing the architecture, development, and maintenance of our company's data infrastructure. You will lead a team of data engineers to design, build, and scale data systems and processes that ensure data quality, accessibility, and reliability. Collaboration with data scientists, analysts, and other stakeholders will be crucial to drive data-driven decision-making across the organization.

You will lead and manage a team of 50+ members, including architects and engineers, ensuring high performance and engagement. Designing and implementing end-to-end Azure solutions, maintaining data architectures, and collaborating with stakeholders to translate business requirements into scalable cloud solutions are key aspects of your role. Your responsibilities will also involve overseeing the development and deployment of data solutions using Azure services such as ADF, Event Hubs, Stream Analytics, Synapse Analytics, Azure Databricks, Azure SQL Database, and Azure DevOps. Ensuring data governance, security, and compliance across all data solutions, collaborating with various team members, and driving continuous improvement and innovation within the engineering team are essential parts of your role.

In terms of client account management, you will build and maintain strong client relationships by understanding their unique needs and challenges. Acting as the main point of contact for clients, developing account plans, and identifying growth opportunities are also part of your responsibilities.

To be successful in this role, you should have a minimum of 15+ years of experience in data engineering or related roles, including at least 5 years in a leadership position. A degree in Computer Science, Information Technology, Data Science, or a related field is required. Key technical skills include expertise in cloud/data solution design, strong experience with Azure cloud technologies, proficiency in data engineering technologies and tools, programming experience in Java, Python, and PySpark, and knowledge of data governance, security, and compliance standards. Leadership skills such as leading high-performing teams, project management, and communication and interpersonal skills are also essential. Your competencies should include strategic thinking, problem-solving skills, the ability to work in a fast-paced environment, strong organizational skills, and a drive for innovation and continuous improvement.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Data Analytics Intmd Analyst position is ideal for a developing professional who can independently handle most problems and has the flexibility to solve complex issues. By merging in-depth specialty-area knowledge with a strong understanding of industry standards and practices, you will contribute towards achieving the objectives of the subfunction/job family. Your role will involve applying analytical thinking and utilizing data analysis tools and methodologies to make informed judgments and recommendations based on factual information.

Your responsibilities will include integrating in-depth data analysis knowledge with industry standards, understanding how data analytics teams collaborate with others to achieve objectives, applying project management skills, and utilizing analytical thinking for accurate judgments and recommendations. You will also be expected to break down information systematically, communicate effectively, and ensure the quality and timeliness of the service your team provides. As an Intmd Analyst, you will provide informal guidance or on-the-job training to new team members, assess risks when making business decisions, and prioritize the firm's reputation and compliance with laws and regulations. Your expertise in Hadoop, Python, Spark, Hive, RDBMS, and Scala, along with knowledge of statistical modeling tools for large data sets, will be crucial for success in this role.

To qualify for this position, you should have at least 5 years of relevant experience, strong expertise in the technologies mentioned above, and the ability to effectively use complex analytical, interpretive, and problem-solving techniques. Excellent interpersonal, verbal, and written communication skills are also essential, and a Bachelor's/University degree or equivalent experience is required. This job description provides an overview of the primary duties involved; additional responsibilities may be assigned as needed.

Citi is an equal opportunity and affirmative action employer, offering a full-time position in the Technology job family group, specifically in Data Analytics. If you are a person with a disability and require accommodations to apply for a career opportunity at Citi, please review the Accessibility at Citi guidelines.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
We are seeking a confident, creative, and curious team player to join us as a Business Analyst in Bangalore, working with one of the largest retail companies in the UK. As part of the team, you will support multiple projects for our client in an Agile environment.

Your responsibilities will include:
- Working closely with Product Managers to deliver outcomes for the product area.
- Owning source-system discovery and data analysis for accuracy and completeness.
- Collaborating with other technology teams to understand tech initiatives.
- Gathering and documenting information on systems.
- Building domain knowledge to understand business requirements.
- Documenting metadata for new and existing data.
- Assisting in maintaining backlogs and user stories.
- Validating data to ensure it meets business requirements.
- Supporting business teams during UAT and post-production phases.
- Supporting the Engineering team in production landscape design.
- Documenting all aspects of data products and performing data lineage analysis.

The ideal candidate will have at least 8 years of relevant Business Analyst experience, with expertise in Big Data Hadoop applications, SQL queries, and Agile methodology. Proficiency in English at the C1 Advanced level is required. If you are a proactive individual with strong analytical skills, experience in business analysis, and a passion for working in a dynamic team environment, we would like to hear from you.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III - Java Full Stack Developer + React + AWS at JPMorgan Chase within the Commercial & Investment Bank team, you'll serve as a member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You'll be responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Responsibilities:
- Execute software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Create secure and high-quality production code and maintain algorithms that run synchronously with appropriate systems.
- Create architectural and design documentation for complex applications, ensuring that software code development adheres to the specified design constraints.
- Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture.
- Contribute to software engineering communities of practice and events that explore new and emerging technologies.
- Add to the team culture of diversity, equality, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Strong experience with recent Java versions, Spring Boot and the Spring Framework, JDBC, and JUnit.
- Experience with RDBMS and NoSQL databases.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).
- Proficiency in Java/J2EE, REST APIs, and web services, and experience building event-driven microservices.
- Experience developing UI applications using React and Angular.
- Working proficiency with development toolsets such as Git/Bitbucket, JIRA, and Maven.
- Proficiency in automation and continuous delivery methods.
- Strong analytical skills and problem-solving ability.
- Working knowledge of AWS; certification is a must.
- Solid understanding of agile methodologies such as CI/CD, application resiliency, and security.
- Ability to design, deploy, and manage AWS cloud infrastructure using services such as EC2, S3, RDS, Kubernetes, Terraform, Lambda, and VPC.
- Working knowledge of AWS Glue, AWS Athena, and AWS S3.
- Experience developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
- Ability to collaborate with development teams to create scalable, reliable, and secure cloud architectures.

Preferred qualifications, capabilities, and skills:
- Exposure to the latest Python libraries.
- Knowledge of AWS Lake Formation.
- Familiarity with modern front-end technologies.
- Experience with big data technologies such as Hadoop.
- Experience with caching solutions such as Hazelcast and Redis.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Genpact is a global professional services and solutions firm with over 125,000 employees in more than 30 countries. We are driven by curiosity, entrepreneurial agility, and the desire to create lasting value for our clients, including Fortune Global 500 companies. Our purpose is the relentless pursuit of a world that works better for people, and we serve leading enterprises with deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the role of Senior Principal Consultant, Research Data Scientist. The ideal candidate should have experience in Text Mining, Natural Language Processing (NLP) tools, data science, Big Data, and algorithms. Full-cycle experience in at least one large-scale Text Mining/NLP project is desirable, spanning business use case creation, Text Analytics assessment/roadmap, technology and analytics solutioning, implementation, and change management. Experience in Hadoop, including development in the MapReduce framework, is also required.

The Text Mining Scientist (TMS) will play a crucial role in bridging enterprise database teams and business/functional resources, translating business needs into techno-analytic problems and working with database teams to deliver large-scale text analytic solutions. The right candidate will have prior experience developing text mining and NLP solutions using open-source tools. Responsibilities include developing transformative AI/ML solutions, managing project delivery and stakeholder/customer expectations, handling project documentation and planning, and staying updated on industrial and academic developments in AI/ML with NLP/NLU applications. The role also involves conceptualizing, designing, building, and developing solution algorithms, interacting with clients to collect requirements, and conducting applied research on text analytics and machine learning projects.

Minimum qualifications/skills:
- MS in Computer Science, Information Systems, or Computer Engineering.
- Systems engineering experience with Text Mining/NLP tools, data science, Big Data, and algorithms.

Technology:
- Proficiency in open-source text mining paradigms such as NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, and Lucene, and cloud-based NLU tools such as DialogFlow and MS LUIS.
- Exposure to statistical toolkits such as R, Weka, S-Plus, Matlab, and SAS Text Miner.
- Strong core Java experience, the Hadoop ecosystem, and Python/R programming skills.

Methodology:
- Solutioning and consulting experience in verticals such as BFSI and CPG.
- Solid foundation in AI methodologies such as ML, DL, NLP, and neural networks.
- Understanding of NLP and statistics concepts and applications such as sentiment analysis.

Preferred qualifications/skills:

Technology:
- Expertise in NLP, NLU, and machine learning/deep learning methods.
- UI development paradigms, Linux, Windows, GPU experience, Spark, and Scala.
- Deep learning frameworks such as TensorFlow, Keras, Torch, and Theano.

Methodology:
- Social network modeling paradigms.
- Text analytics using NLP tools and text analytics implementations.

This is a full-time position based in India-Noida. The candidate should have a Master's degree or equivalent education level. The job was posted on Oct 7, 2024, with no unposting date set. The primary skills required are digital, and the job category is full time.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As an integral part of our Data Automation & Transformation team, you will experience unique challenges every day. We are looking for someone with a positive attitude, an entrepreneurial spirit, and a willingness to dive in and get things done. This role is crucial to the team and will provide exposure to various aspects of managing a banking office.

In this role, you will focus on building curated data products and modernizing data by moving it to Snowflake. Your responsibilities will include working with cloud databases such as AWS and Snowflake, along with coding languages like SQL, Python, and PySpark. You will analyze data patterns across large multi-platform ecosystems and develop automation solutions, analytics frameworks, and data consumption architectures used by the Decision Sciences, Product Strategy, Finance, Risk, and Modeling teams. Ideally, you should have a strong analytical and technical background in financial services, particularly in the small business banking or commercial banking segments.

Your key responsibilities will involve migrating Private Client Office data to the public cloud (AWS and Snowflake), collaborating closely with the Executive Director of Automation and Transformation on new projects, and partnering with various teams to support data analytics needs. You will also be responsible for developing data models, automating data assets, identifying technology gaps, and supporting data integration projects with external providers.

To qualify for this role, you should have at least 3 years of experience in analytics, business intelligence, data warehousing, or data governance. A Master's or Bachelor's degree in a related field (e.g., Data Analytics, Computer Science, Math/Statistics, or Engineering) is preferred. You must have a solid understanding of programming languages such as SQL, SAS, Python, Spark, Java, or Scala, and experience building relational data models across different technology platforms. Excellent communication, time management, and multitasking skills are essential, along with experience in data visualization tools and compliance with regulatory standards. Knowledge of risk classification, internal controls, and commercial banking products and services is desirable.

Preferred qualifications include experience with Big Data and cloud platforms, data wrangling tools, dynamic reporting applications like Tableau, and proficiency in data architecture, data mining, and analytical methodologies. Familiarity with job scheduling workflows, code versioning software, and change management tools would be advantageous.
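As a toy illustration of the "curated data product" idea this posting centers on, the sketch below aggregates raw transactions into a clean, consumption-ready table. SQLite from the standard library stands in for a cloud warehouse such as Snowflake, and every table and column name is invented for the example.

```python
# Toy "curated data product" sketch: raw transactions are aggregated into
# a clean, consumption-ready table. SQLite stands in for a cloud warehouse
# such as Snowflake; all table/column names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_transactions (customer TEXT, amount REAL);
    INSERT INTO raw_transactions VALUES
        ('acme', 100.0), ('acme', 50.0), ('globex', 75.0);

    -- curated layer: one row per customer, ready for analytics consumers
    CREATE TABLE curated_customer_totals AS
        SELECT customer, SUM(amount) AS total_amount
        FROM raw_transactions
        GROUP BY customer
        ORDER BY customer;
""")
rows = conn.execute("SELECT * FROM curated_customer_totals").fetchall()
print(rows)  # [('acme', 150.0), ('globex', 75.0)]
```

The layering shown here (raw table in, aggregated curated table out) is the same pattern whether the engine is SQLite, Snowflake, or a PySpark job writing to a data lake.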
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Science Developer at our company in Baner, Pune, you will be instrumental in creating and implementing data-driven solutions to complex business problems. Your primary responsibilities will involve data analysis, model development, data pipeline creation, visualization, and collaboration with various teams.

You will collect, clean, and preprocess both structured and unstructured data, then conduct exploratory data analysis to reveal patterns and insights. Furthermore, your role will include designing, building, and enhancing machine learning and statistical models, and validating their performance and robustness for deployment. Developing and managing end-to-end data pipelines for model deployment, as well as integrating models seamlessly into existing systems, will be crucial aspects of the job. Your ability to build intuitive dashboards and reports that communicate insights effectively and meet business requirements will play a vital role in your day-to-day tasks.

Collaboration is key in this role: you will work closely with data engineers, software developers, and business analysts to align with project objectives. Documenting processes, models, and insights for knowledge sharing is an essential part of maintaining a cohesive team effort.

To excel in this position, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field, along with a minimum of 3 to 4 years of experience in data science or related roles. Proficiency in programming languages such as Python or R, familiarity with machine learning libraries like scikit-learn, TensorFlow, and PyTorch, and experience with data visualization tools are essential requirements. Additionally, knowledge of SQL, relational databases, and big data technologies like Hadoop or Spark would be advantageous.

If you have strong problem-solving skills, attention to detail, and excellent communication and teamwork abilities, we invite you to apply for this full-time position. Benefits include health insurance, life insurance, paid sick time, and Provident Fund. The work schedule is during day shifts, and a total of 5 years of work experience is required for this role.
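The pipeline stages listed above (clean, split, fit, validate) can be sketched end to end without any ML library. This is a hedged illustration only: a mean-value baseline stands in for a real scikit-learn or TensorFlow model, and the data is invented, but the stage boundaries mirror the workflow the role describes.

```python
import statistics

def clean(records):
    """Preprocessing step: drop records with a missing target value."""
    return [r for r in records if r["y"] is not None]

def train_test_split(records, test_ratio=0.25):
    """Deterministic split so the validation step is reproducible."""
    cut = int(len(records) * (1 - test_ratio))
    return records[:cut], records[cut:]

def fit_baseline(train):
    """'Model development': a mean predictor as the simplest baseline."""
    return statistics.mean(r["y"] for r in train)

def validate(prediction, test):
    """Performance validation: mean absolute error on held-out data."""
    return statistics.mean(abs(r["y"] - prediction) for r in test)

# Invented target values; one record is dropped by the cleaning step.
data = [{"y": v} for v in (10, 12, 11, None, 13, 9, 14, 10)]
train, test = train_test_split(clean(data))
baseline = fit_baseline(train)
print(validate(baseline, test))
```

Swapping `fit_baseline` for a real estimator changes only that one stage; the clean/split/validate scaffolding is what keeps a deployed model's performance claims honest.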
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Principal/Sr. Lead Engineer - Automation at FIS, you will be part of the team responsible for developing the next-generation compliance product. Your role will involve playing a key part in automation for cloud-based applications, focusing on understanding data flow and configurations across multiple environments to expedite release/build verifications and ultimately improve quality and stability. Leveraging your expertise in scripting, AWS, Jenkins, and Snowflake, you will collaborate with cross-functional teams to solve challenging problems and drive strategic automation initiatives.

Your responsibilities will include devising and using automation frameworks and DevOps best practices to set up end-to-end automation pipelines, reviewing and validating data for uniformity and accuracy, analyzing results for failures, and interpreting them with clear objectives in mind. You will have the opportunity to work with multiple products and businesses, gaining a deep understanding of the trading and compliance domain.

Key Requirements:
- Minimum of five years of experience in AWS, Snowflake, and DevOps automation, with a proven track record of delivering impactful solutions.
- Strong SQL skills and proficiency in programming languages such as Python and Unix scripting.
- Experience with Jenkins build pipelines and release deployment automation.
- Strong analytical and problem-solving skills to translate business requirements into analytical solutions.
- Excellent communication and presentation skills to convey complex technical concepts effectively.
- Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Demonstrated ability to work collaboratively in a team environment, manage multiple priorities, and automate data creation processes to reduce manual effort.

Responsibilities:
- Define the automation plan and own it end-to-end for release verification and CI/CD pipeline setup.
- Understand product architecture and workflow to build an optimized automation pipeline for continuous delivery.
- Collaborate with product and solution management teams to convert business use cases into efficient automation setups and execution.
- Stay updated on the latest advancements in DevOps and AWS automation to leverage current concepts and methodologies.
- Contribute to the development and implementation of best practices for DevOps, automation, and release deployment/verification.
- Set up new and upgrade existing environments for the automation pipeline, and monitor them for failure analysis during daily sanity and regression verification.

Qualifications:
- Bachelor's or Master's degree in computer science or a related field.

Competencies:
- Fluent in English.
- Excellent communicator, able to discuss automation initiatives and propose optimized solutions.
- Attention to detail and a focus on quality.
- Organized approach to managing and adapting priorities according to client and internal requirements.
- Self-starter with a team mindset, capable of working autonomously and as part of a global team.

FIS offers a multifaceted job with a high degree of responsibility, visibility, and ownership, along with opportunities for growth and learning in the trading and compliance space. You will benefit from a competitive salary and benefits, a variety of career development tools, resources, and opportunities, and a supportive work environment. Join us at FIS, the global leader in financial technology solutions, and be part of a dynamic team dedicated to innovation and excellence.
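One of the checks this role describes, "understanding data flow and configurations for multiple environments" to validate uniformity before a release, reduces to detecting configuration drift across environments. A minimal sketch (the environment names and config keys are hypothetical; a real pipeline would load these from AWS or a config store and gate the Jenkins stage on the result):

```python
def config_drift(environments):
    """Flag keys whose values differ across environments.

    `environments` maps an environment name to its config dict; any key
    with more than one distinct value (or missing in some environment)
    is reported, which is the kind of uniformity check a release
    verification stage can gate on.
    """
    all_keys = set().union(*(cfg.keys() for cfg in environments.values()))
    drift = {}
    for key in sorted(all_keys):
        values = {env: cfg.get(key, "<missing>")
                  for env, cfg in environments.items()}
        if len(set(values.values())) > 1:
            drift[key] = values
    return drift

# Hypothetical configs: only db_host is *expected* to differ here.
envs = {
    "staging": {"db_host": "stg-db", "feature_x": "on"},
    "prod":    {"db_host": "prod-db", "feature_x": "on"},
}
print(sorted(config_drift(envs)))  # → ['db_host']
```

A real setup would subtract an allow-list of intentionally environment-specific keys (like `db_host`) and fail the build only on the remainder.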
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The AVP, Business and Customer Analytics (L10) at Synchrony plays a crucial role in delivering high-impact projects, collaborating with various analytics teams to solve key business problems using data-driven solutions. As part of the India Analytics Hub, you will work on projects that enable the company's growth and profitability through advanced analytics techniques.

Your responsibilities will include supporting American Eagle business stakeholders with data-driven insights, leading analytics projects from inception to delivery, deriving actionable recommendations from data insights, and ensuring project timelines and accuracy are met. You will also contribute to internal initiatives, handle multiple projects, and demonstrate strong project management skills.

The ideal candidate holds a degree in Statistics, Mathematics, Economics, Engineering, or a related quantitative field, with at least 4 years of hands-on experience in analytics or data science. Proficiency in SQL/SAS programming, business intelligence tools like Tableau and Power BI, Google Cloud Platform, ETL processes, and Big Data analytics is required. Experience in campaign sizing, customer targeting, and the financial services industry will be beneficial. Desired skills include working with Python/R, big data technologies like Hadoop/Hive/GCP, and report automation. Effective communication skills, the ability to lead strategic projects independently, and the capacity to manage competing priorities are essential for this role.

The role offers Enhanced Flexibility and Choice in work timings, requiring availability between 06:00 AM and 11:30 AM Eastern Time for meetings with global teams. Internal applicants should ensure they meet the eligibility criteria, inform their manager, update their professional profile, and upload an updated resume during the application process.

If you are a motivated individual with a passion for analytics and a desire to drive business growth through data-driven solutions, this role provides an exciting opportunity to make a significant impact within the organization.
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Responsibilities: This function covers incumbents responsible for various data activities, including data analysis, maintenance, data quality, and continuous interaction with business users to understand requirements and convert them into the needed code. An understanding of marketing data and the Retail line of business is a plus. Day-to-day work focuses on creating SAS code to audit campaign data, execute campaigns, identify deviations, and verify their correctness. BAU also includes creating reports for business users in the Retail line of business using SAS and Excel, with a planned migration to Tableau or an equivalent approved reporting tool. Knowledge of Autosys and ServiceNow is an added advantage.

Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Technology Stack:
- SAS (intermediate to expert) for creating reports and complex data sets
- Excel, and Tableau or an equivalent reporting tool
- Beginner/intermediate knowledge of Python/PySpark and Hadoop/Hive
- High attention to detail and analytical skills
- Logical approach to problem solving, and good written and verbal communication skills

------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Data/Information Management
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
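The campaign-audit work described in this posting is done in SAS, but the core plan-vs-actual deviation check is language-neutral. A hedged sketch (campaign ids, counts, and the 2% tolerance are all hypothetical) of what such an audit job computes:

```python
def audit_campaigns(planned, executed, tolerance=0.02):
    """Flag campaigns whose executed volume deviates from the plan.

    `planned` / `executed` map a campaign id to its record count; a
    campaign is flagged when it was never executed or when its count
    deviates from plan by more than `tolerance` (relative). This mirrors
    the plan-vs-actual comparison a campaign audit performs.
    """
    deviations = {}
    for cid, plan_n in planned.items():
        exec_n = executed.get(cid)
        if exec_n is None:
            deviations[cid] = "not executed"
        elif plan_n and abs(exec_n - plan_n) / plan_n > tolerance:
            deviations[cid] = f"count off by {exec_n - plan_n:+d}"
    return deviations

# Hypothetical campaign volumes: one under-delivered, one never ran.
planned  = {"CMP-01": 10_000, "CMP-02": 5_000, "CMP-03": 800}
executed = {"CMP-01": 10_050, "CMP-02": 4_200}
print(audit_campaigns(planned, executed))
# → {'CMP-02': 'count off by -800', 'CMP-03': 'not executed'}
```

The flagged dictionary is what would feed the business-facing report (in SAS/Excel today, Tableau after the planned migration).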
Posted 1 week ago