2.0 years
0 Lacs
Faridabad, Haryana, India
On-site
Job Description About Edspectrum Foundation: Edspectrum Foundation is a Section 8 non-profit company started in 2019. Edspectrum Foundation is a social enterprise that caters to four aspects of education: early childhood education and consultancy, whole school transformation, education content creation and translation, and CSR program implementation. We have worked with organisations like Nascomm Foundation, Garden Valley International School, United India, Tag Hive, Cartwheel School, Shahi Exports, Baazi Games and more. Edspectrum Foundation began with a profound understanding of the non-linear nature of children's learning, development, and growth. We recognise that a child's learning journey is not confined to the classroom; it extends to their homes, communities, schools, and the places they explore. Our existence revolves around reaching every aspect of a community to facilitate each child's holistic learning, development, and growth. Our mission is to provide quality learning and development experiences to every child in India, empowering them to realise their full potential in adulthood, regardless of their socio-economic background. Our Philosophy: As adults, our lives are surrounded by various facets, whether it's our career, relationships, marriage, parents, or social life. We are continually engaged in efforts to enhance these aspects. As we grow older, the knowledge, mindset, and skills required to improve them become harder to build and demand greater effort. Initiating efforts with children from the early stages allows them to tap into their potential more effectively as adults. Learning and understanding are more accessible during childhood than amid the challenges of adulthood. Therefore, we begin our mission in early childhood, where working on children's potential is easiest. If we miss this window, we can still collaborate with schools or after-school programs during their primary and secondary years, partnering with CSR initiatives. In cases where we can't reach children who need our assistance, we provide endowment funds for short-term projects so that we can reach them in seemingly unreachable places. Even if we miss these opportunities, we continue to support youth and adults in their development and growth. Through collaborations with other organisations, we aim to provide accessible content, training, and mentorship to build specific skills for empowerment and potential development. Our mission is straightforward: to prepare individuals for adult life, equipping them with the mindset and skills necessary to navigate the complexities of work, career, relationships, family, and social life. We aim to guide them towards realising their full potential, enabling them to handle these aspects confidently and proficiently rather than struggling through them. Position Summary: As a Teacher Champion, you will deliver high-impact English lessons in under-resourced schools, mentor peers, and drive continuous improvement. Location: Faridabad, Haryana & Delhi, Lajpat Nagar (fully field-based) Working Hours: 11:00 AM – 5:30 PM (excluding travel time) Working Days: Mon–Fri Partner Schools: 4 Salary: 15,000 INR Duration: June/July 2025 – March 2026 Reports to: Project Lead Type of Job: Contractual Joining: Immediate Key Responsibilities 1. Lesson Delivery & Student Engagement Teach weekly LSRW (listening, speaking, reading, writing) classes in your assigned classrooms in our partner schools.
Design contextualised, interactive activities (role-plays, games, art, etc.) linked to real life. Track and document attendance, participation, and formative assessments, as well as baseline, midline and endline assessments, weekly. Conduct and document weekly 5–10 minute speaking/listening calls with each class's students; coordinate schedules with parents and the Program Manager. 2. Self-Learning & Coaching Participate in all organisational learning/coaching programs on-site and off-site, and share reflections. Develop and deliver one monthly learning workshop for teacher peers. Help school teachers build their teaching skills through co-teaching, in coordination with the Project Lead. 3. Parent & Community Engagement Assist in home visits to mobilise parents toward children's learning and attendance. Coordinate parent calls for student absenteeism and share progress. Co-organise the National English Olympiad in each school with the Program Manager and proctor exams. Organise 5-6 student learning showcase events demonstrating students' English skills across all partner schools. 4. Content Creation & Reporting Post one weekly social media update highlighting classroom activities and student reflections. Write one monthly blog reflecting on teaching experiences and student learning. Submit a monthly report of your work, with student portfolios and activity and progress data, to the Project Lead. 5. Program Continuity & Backup Planning Build and maintain absence backup plans for each class (alternate activities/materials). Ensure your classes have substitutes or self-learning modules when you are unavailable. (If you want, you can organise volunteers for your classrooms – a certificate will be issued to them if they work with us for at least a month.) 6. Daily WhatsApp Content Delivery Send contextualised reading-comprehension passages, vocabulary tables, and MCQs via WhatsApp daily to reinforce learning. Mindset & Environment Passion for Youth: Love working with students aged 11–15. Resilient & Flexible: Thrives in fast-paced, startup-style conditions with limited comforts. Social Commitment: Motivated by social impact in underprivileged settings. The ideal candidate should have a deep love for children, regardless of their backgrounds, and be resourceful in meeting the diverse needs of our work. They should have a strong desire to support the less privileged sections of society by actively engaging with these communities and their members despite hard and tough working conditions. Being resourceful is key to this role. Candidates should be comfortable working in low-income private schools and government schools with harsh conditions, where heat, electricity, sanitation and hygiene can be a big problem for both students and staff. And lastly, they should love working in the social sector and want to make an impact in the lives of others. If someone is here mainly for the money and is driven only by the salary component, they won't be a fit for the work this project demands. Success Metric: You will know you have succeeded if: Your students show measurable improvement in English LSRW skills across baseline, midline, and endline assessments. Students regularly attend and participate in class, showing increased confidence and engagement. Your class is consistently supported with clear plans, backups, and learning showcases. Your lessons contribute to Edspectrum's vision of building a sustainable, community-driven learning ecosystem.
Core Values Embody and celebrate these values in all interactions with students, colleagues, and stakeholders. These values bind us together, fostering a sense of unity as we passionately and lovingly work towards our common mission for the well-being of our children. So, it is a must to practice and build these values in your work with Edspectrum Foundation. Respect Empathy Courage Determination Appreciation Growth Mindset Bias for Action Integrity Qualifications & Experience A bachelor's degree in education, English, or a related field; teaching certification is preferred. 0–2 years of experience teaching English in challenging or low-resource contexts. Creativity and curiosity to create engaging, learner-centred lessons. Familiarity with WhatsApp, Google tools, and basic ed-tech tools. Strong organisational, communication, interpersonal, adaptive and collaborative skills. Join us to transform English learning for underprivileged students and build a brighter future through language empowerment! Apply: Mail your resume to Piyush Jain at piyush@edspectrumfoundation.org
Posted 2 days ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview Working at Atlassian Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company. Responsibilities Team: Core Engineering Reliability Team Collaborate with engineering and TPM leaders, developers, and process engineers to create data solutions that extract actionable insights from incident and post-incident management data, supporting objectives of incident prevention and reducing detection, mitigation, and communication times. Work with diverse stakeholders to understand their needs and design data models, acquisition processes, and applications that meet those requirements. Add new sources, implement business rules, and generate metrics to empower product analysts and data scientists. Serve as the data domain expert, mastering the details of our incident management infrastructure. Take full ownership of problems from ambiguous requirements through rapid iterations. Enhance data quality by leveraging and refining internal tools and frameworks to automatically detect issues. Cultivate strong relationships between teams that produce data and those that build insights. Qualifications Minimum Qualifications / Your background: BS in Computer Science or equivalent experience with 8+ years as a Senior Data Engineer or in a similar role 10+ years of progressive experience in building scalable datasets and reliable data engineering practices. Proficiency in Python, SQL, and data platforms like Databricks Proficiency in relational databases and query authoring (SQL). Demonstrable expertise designing data models for optimal storage and retrieval to meet product and business requirements. Experience building and scaling experimentation practices, statistical methods, and tools in a large-scale organization Excellence in building scalable data pipelines using Spark (SparkSQL) with the Airflow scheduler/executor framework or similar scheduling tools. Expert experience working with AWS data services or similar Apache projects (Spark, Flink, Hive, and Kafka). Understanding of Data Engineering tools/frameworks and standards to improve the productivity and quality of output for Data Engineers across the team. Well-versed in modern software development practices (Agile, TDD, CICD) Desirable Qualifications Demonstrated ability to design and operate data infrastructure that delivers high reliability for our customers. Familiarity working with datasets like Monitoring, Observability, Performance, etc. Benefits & Perks Atlassian offers a wide range of perks and benefits designed to support you, your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits. About Atlassian At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success.
To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you with the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
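To give a concrete flavour of the Spark-plus-Airflow pipeline work described in the responsibilities above, here is a minimal, illustrative sketch; it is not Atlassian's actual code. It assumes the apache-airflow-providers-apache-spark package is installed and a "spark_default" connection is configured, and the DAG id, application path and arguments are invented for the example.

```python
# A hedged sketch of a daily Spark batch job scheduled with Airflow.
# Assumes the apache-spark provider package and a "spark_default" connection.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="incident_metrics_daily",        # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit a SparkSQL batch job that aggregates incident-management events.
    # The application path and arguments are placeholders.
    build_incident_metrics = SparkSubmitOperator(
        task_id="build_incident_metrics",
        application="/opt/jobs/build_incident_metrics.py",
        conn_id="spark_default",
        application_args=["--run-date", "{{ ds }}"],
    )
```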
Posted 3 days ago
6.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Job Requirements Role/ Job Title: Senior Data Engineer Business: New Age Function/ Department: Data & Analytics Place of Work: Mumbai/Bangalore Roles & Responsibilities Minimum 6 years of Data Engineering experience and 3 years in a large-scale Data Lake ecosystem Proven expertise in SQL, Spark, Python, Scala and the Hadoop ecosystem Have worked on multiple TBs/PBs of data volume from ingestion to consumption Work with business stakeholders to identify and document high-impact business problems and potential solutions First-hand experience with the complete software development life cycle including requirement analysis, design, development, deployment, and support Advanced understanding of Data Lake/Lakehouse architecture and experience/exposure to Hadoop (Cloudera, Hortonworks) and AWS Work on the end-to-end data lifecycle across the data ingestion, data transformation and data consumption layers Well-versed with APIs and their usability A suitable candidate will also be proficient in Spark, Spark Streaming, AWS, and EMR A suitable candidate will also demonstrate machine learning experience and experience with big data infrastructure inclusive of MapReduce, Hive, HDFS, YARN, HBase, Oozie, etc. The candidate will additionally demonstrate substantial experience and a deep knowledge of data mining techniques, relational, and non-relational databases. Advanced skills in technical debugging of the architecture in case of issues Creating Technical Design Documentation (HLD/LLD) of the projects/pipelines Secondary Responsibilities Ability to work independently and handle your own development effort. Excellent oral and written communication skills Learn and use internally available analytic technologies Identify key performance indicators and establish strategies on how to deliver on these key points for analysis solutions Use educational background in data engineering and perform data mining analysis Work with BI analysts/engineers to create prototypes, implementing traditional classifiers and determiners, predictive and regressive analysis points Engage in the delivery and presentation of solutions Managerial & Leadership Responsibilities Lead moderately complex initiatives within Technology and contribute to large-scale data processing framework initiatives related to enterprise strategy deliverables Build and maintain optimized and highly available data pipelines that facilitate deeper analysis and reporting Review and analyze moderately complex business, operational or technical challenges that require an in-depth evaluation of variable factors Oversee the data integration work, including integrating a data model with the data lake, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis Resolve moderately complex issues and lead teams to meet data engineering deliverables while leveraging a solid understanding of data information policies, procedures and compliance requirements Collaborate and consult with colleagues and managers to resolve data engineering issues and achieve strategic goals Key Success Metrics Ensure timely deliverables. Spot data fixes. Lead technical aspects of the projects. Error-free deliverables.
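As an illustration of the ingestion-to-consumption data lake flow this role describes, below is a minimal PySpark sketch. It is not the bank's actual pipeline; the S3 path, database, table and column names are invented for the example, and Hive support is assumed to be available on the cluster.

```python
# Illustrative only: a PySpark sketch of the ingestion -> transformation ->
# consumption flow described above. All names and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake_txn_pipeline").enableHiveSupport().getOrCreate()

# Ingestion: read raw events landed on the data lake.
raw = spark.read.json("s3a://lake/raw/transactions/dt=2025-06-01/")

# Transformation: cleanse and aggregate into an analytical model.
daily_summary = (
    raw.filter(F.col("status") == "SETTLED")
       .withColumn("txn_date", F.to_date("event_ts"))
       .groupBy("txn_date", "merchant_id")
       .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Consumption: publish a partitioned table for downstream reporting.
(daily_summary.write.mode("overwrite")
    .partitionBy("txn_date")
    .saveAsTable("analytics.daily_merchant_summary"))
```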
Posted 3 days ago
170.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Job Summary Strategy Coordinate with the leadership team and interfaces to define data quality solutions Partner with the business team to evaluate new ideas or initiatives initiated by the technology or business teams. Business Manage digitization and automation programs for improving client experience. Manage and work with architects, solution engineers and product teams to facilitate prospecting Processes Responsible for business delivery aligning to SCB-defined technology and project management processes & frameworks and customizing these to specific Cash needs where required Manage solutions delivery within timelines. People & Talent Train and mentor newer/junior team members to come up the curve by having training documents in place and conducting knowledge-sharing sessions. Monitor the progress of team members and provide continuous feedback so they can progress and grow. Governance Audit engagement/management process compliance Maintain the list of traceability for each requirement and solution Key Responsibilities Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Fill in for regulated roles] Lead the team to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Serve as a Director of the Board of [insert name of entities] Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent) Key stakeholders Work closely with the Product owners, Business Solution Leads and Development team to support the analysis and solutioning of various Payment initiatives. Work closely with Product managers, Business Solution Leads, the Development team and other peer BAs in the team to ensure standardization of the Payments platform across countries Suggest areas for improvement in internal processes along with possible solutions. Work closely with the Program and Remote Project Management Teams & Scrum masters to achieve key milestones and deliverables, and track to ensure success of the overall project delivery Communicate effectively with System Development/Technology teams to establish appropriate solutions to meet business requirements. Support the testing team as required, ensuring product quality Other Responsibilities Embed Here for good and the Group’s brand and values in the team. Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures; Multiple functions (double hats); Scope & Solution Manage scope in line with requirements, solutions and user stories, in line with MVPs. If there is any deviation in scope, document it with approvals. Review solutions and analysis thoroughly so that gaps do not arise after the solution is finalized. Contribute to architecture definition for large and complex projects and be responsible for the technical solution for projects falling within a vertical or a domain Key Roles & Responsibilities Responsible for managing and providing business solutions for payment and clearing applications.
Responsible for product development and requirement specifications for payment applications, including understanding business requirements, defining business and functional requirements and working with the business and development teams to support seamless project delivery. Engage with the country business to understand relevant trends and opportunities for payment products to better serve customers. Plan a roadmap of enhancements and initiatives that would deliver a stronger payment roadmap and work with the Segment Product Manager to identify suitable processes for funding, development and prioritization. Maximize technology investment for payment product development, ensuring criteria for project ranking and prioritization are clearly communicated to the business. Work closely with the Program & Remote Project Management Teams to support achievement of key milestones and deliverables, and track to ensure success of the overall project delivery. Skills And Experience Excellent communication and stakeholder management Presentation and documentation skills Agile methodologies and process SQL queries, Oracle, DB2 Microservice architecture Database structures and design Data Analytics and Dashboard tools Java, Spring Boot, cloud architecture Industry experience in new payments product launches Industry experience in launching new features to clients Supporting streams for payments such as Screening, Auditing, Pricing and Billing, core banking, etc. Data Quality & Metadata tools like Informatica Hands-on experience and knowledge of the following applications: Hazelcast, Hive, Elasticsearch, Dremio, Kafka, Avro, Tableau & MicroStrategy, Postgres Qualifications Experience and in-depth understanding in a product management/business analyst role in a financial institution, e-commerce or online environment. Strong analytical skills and able to assess multiple systems for troubleshooting. Good problem solver and decision maker. Ability to write well and experience writing various business documents (e.g. business requirements documents, functional specifications, presentations and reports). Ability to perform interface mapping between upstream and downstream applications. Skills to execute basic SQL queries and perform System Testing for the developed product. Excellent communication skills, confidence and ability to work with an international team in a cross-cultural and geographically dispersed workplace. Should possess qualities that would be useful in multiple-stakeholder management. Motivated, able to work independently, proactively and efficiently in a fast-paced and changing environment. Excited and passionate about Banking Business. About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which combine to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 3 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities. At least two years of experience building and leading highly complex, technical data engineering teams (10+ years of hands-on data engineering experience overall). Lead the data engineering team, from sourcing to closing. Drive strategic vision for the team and product Responsibilities: Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements Experience managing a data-focused product/ML platform Hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala. Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards Provide expertise in the area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation Experience managing, hiring and coaching software engineering teams. Experience with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality. Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary Required Skills: Experience: 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems. Proficiency in Functional Programming: High proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines. Proficiency in Big Data Technologies: Strong experience with Apache Spark and Hadoop ecosystem tools such as Hive, HDFS, and YARN; Airflow, DataOps, and data management. Programming and Scripting: Advanced knowledge of Scala and a good understanding of Python for data engineering tasks. Data Modeling and ETL Processes: Solid understanding of data modeling principles and ETL processes in big data environments. Analytical and Problem-Solving Skills: Strong ability to analyze and solve performance issues in Spark jobs and distributed systems. Version Control and CI/CD: Familiarity with Git, Jenkins, and other CI/CD tools for automating the deployment of big data applications. Desirable Experience: Real-Time Data Streaming: Experience with streaming platforms such as Apache Kafka or Spark Streaming. Python data engineering experience is a plus. Financial Services Context: Familiarity with financial data processing, ensuring scalability, security, and compliance requirements. Leadership in Data Engineering: Proven ability to work collaboratively with teams to develop robust data pipelines and architectures.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
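As a small illustration of the Spark performance work this role calls out (the posting emphasises Scala; the same pattern is shown here in PySpark as a hedged, generic sketch rather than Citi's code), below is an example of avoiding a shuffle-heavy join by broadcasting a small dimension table. All table names, paths and columns are invented.

```python
# A hedged PySpark sketch of a common Spark optimisation: broadcasting a small
# reference table to avoid shuffling the large fact table. Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast, col

spark = SparkSession.builder.appName("trade_enrichment").enableHiveSupport().getOrCreate()

trades = spark.table("raw.trades").filter(col("trade_date") == "2025-06-01")  # partition pruning
products = spark.table("ref.products")                                        # small reference table

# Broadcast join: ships the small table to every executor instead of shuffling trades.
enriched = trades.join(broadcast(products), on="product_id", how="left")

enriched.write.mode("overwrite").partitionBy("trade_date").parquet("/data/curated/trades_enriched")
```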
Posted 3 days ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
TransUnion's Job Applicant Privacy Notice What We'll Bring We are India’s leading credit information company with one of the largest collections of consumer information. We aim to be more than just a credit reporting agency. We are a sophisticated, global risk information provider striving to use information for good. We take immense pride in playing a pivotal role in catalyzing the BFSI industry in the country. We got here by tapping into our excitement and passion for wanting to make a difference in the lives of our clients and consumers. What are excitement and passion for us? We define them as a blend of curiosity, the ability to unlearn and yet continuously learn, the ability to connect with meaning and, finally, the drive to execute ideas until the last mile is achieved. This passion helps us focus on continuous improvement, creative problem solving and collaboration, which ensures delivery excellence. What You'll Bring The incumbent will play a vital role in setting up the data analytics technology practice in TransUnion UK. This practice will have functions such as Data Engineering, Data Research, Data Visualization, Analytics Systems and Innovation Culture. This role will be a bridge between GT and other verticals like ISG and DSA. Accelerate and transform data analytics technology by helping TransUnion build more trust using advanced data analytics. Set up the practice, which will include functions such as Data Engineering, Data Analytics Research, Data Visualization, Analytics Systems and Innovation Culture Work closely with the CIO to plan the data analytics technology roadmap for the next 5 years Work with key stakeholders to ensure alignment of the objectives across business units and the action plans required to achieve the same Work with GT and other Leadership and align functions and roadmap to achieve business goals Ensure cloud readiness of data and systems and plan the roadmap for analytics in the cloud The incumbent will lead the development of big data capabilities and utilization as well as the coordination Set up and manage a team of Data Engineers, ML Engineers, Data Analysts, Data Admins and Architects Managing budgets, projects, people, stakeholders and vendors is an integral part of this role Prepare the machine learning roadmap with AutoML, ML pipelines and advanced analytical tools Local analytical system (capacity planning, costing, setup, tuning, management, user management, operations, cloud readiness planning) Impact You'll Make 5+ years of IT experience with 4+ years of relevant experience in Data Analytics Technology Experience with the big data technology stack: Hive, Spark, Pig, Sqoop, Kafka, Flume, etc. Experience in Data Architecture and Data Governance Experience in managing a team of Data Engineers, ML Engineers and Data Analysts Experience in costing, capacity planning and architecture of an advanced data analytics ecosystem Experience in working across geographies, regions and countries Experience in the Data Engineering, Data Visualization and Data Analytics tech stack This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week. TransUnion Job Title Sr Analyst, Data Science and Analytics
Posted 3 days ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code Consult with users, clients, and other technology groups on issues, and recommend programming solutions, install, and support customer exposure systems Apply fundamental knowledge of programming languages for design specifications. Analyze applications to identify vulnerabilities and security issues, as well as conduct testing and debugging Serve as advisor or coach to new or lower-level analysts Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and/or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 5-7 years of relevant experience in design and development using ETL tools - Ab Initio / MS SQL Server technology Must have worked on various Ab Initio products like EME/Express-IT and have experience in Continuous Flows. Must be familiar with ETL concepts and good at implementing them. Knowledge of NoSQL databases/Hadoop-Hive and Spark will be an added advantage Experience in systems analysis and programming of software applications Experience in managing and implementing successful projects Working knowledge of consulting/project management techniques/methods Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
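Since the role centres on ETL concepts (the toolset itself, Ab Initio, is a proprietary graphical environment and is not reproduced here), below is a hedged, generic Python sketch of the extract-transform-load-and-reconcile pattern the posting describes. It uses the standard library's sqlite3 module purely so the example is self-contained; the table names and sample rows are invented.

```python
# Not Ab Initio: a minimal, generic sketch of an ETL step - stage raw rows,
# transform and load the valid ones, then reconcile counts as a basic control.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a staging table standing in for a source feed.
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount TEXT, status TEXT)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                [(1, "100.50", "OK"), (2, "bad", "OK"), (3, "75.00", "CANCELLED")])

# Transform + load: cast amounts, drop rejects, keep only completed orders.
cur.execute("CREATE TABLE fct_orders (order_id INTEGER, amount REAL)")
cur.execute("""
    INSERT INTO fct_orders
    SELECT order_id, CAST(amount AS REAL)
    FROM stg_orders
    WHERE status = 'OK' AND amount GLOB '[0-9]*.[0-9]*'
""")

# Reconcile: compare loaded rows against staged rows as a simple quality control.
loaded = cur.execute("SELECT COUNT(*) FROM fct_orders").fetchone()[0]
print(f"Loaded {loaded} of 3 staged rows after validation")
```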
Posted 3 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Lead Data Engineer – C12 / Assistant Vice President (India) The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 8 to 12 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills An inclination to mentor; an ability to lead and deliver medium-sized components independently Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Expertise around Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure to working with Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Experience of using a job scheduler, e.g., Autosys.
Exposure to Business Intelligence tools, e.g., Tableau, Power BI. Certification on any one or more of the above topics would be an advantage. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
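To illustrate the "Data Quality & Controls" and analytical-model responsibilities listed above, here is a minimal, hypothetical PySpark sketch (not Citi's framework) that runs two simple controls before publishing a curated, partitioned dataset. The paths, column names and checks are assumptions made for the example.

```python
# Illustrative sketch of simple data-quality controls before publishing a
# curated dataset. Paths, thresholds and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_controls").getOrCreate()

accounts = spark.read.parquet("/data/staged/accounts/")

total = accounts.count()
null_keys = accounts.filter(F.col("account_id").isNull()).count()
dupes = total - accounts.dropDuplicates(["account_id"]).count()

# Fail fast if controls are breached, otherwise publish in a columnar format
# partitioned by an assumed as_of_date column.
if null_keys > 0 or dupes > 0:
    raise ValueError(f"DQ breach: {null_keys} null keys, {dupes} duplicate keys")

accounts.write.mode("overwrite").partitionBy("as_of_date").parquet("/data/curated/accounts/")
```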
Posted 3 days ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary The Chapter Lead, Backend Development is a hands-on developer role focusing on back-end development, and is accountable for people management and capability development of their Chapter members. Responsibilities in detail are: Responsibilities Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Maintain exemplary coding standards within the team, contributing to code base development and code repository management. Perform code reviews to guarantee quality and promote a culture of technical excellence in Java development. Function as a technical leader and active coder, setting and enforcing domain-specific best practices and technology standards. Allocate technical resources and personal coding time effectively, balancing leadership with hands-on development tasks. Maintain a dual focus on leadership and hands-on development, committing code while steering the chapter's technical direction. Oversee Java backend development standards within the chapter across squads, ensuring uniform excellence and adherence to best coding practices. Harmonize Java development methodologies across the squad, guiding the integration of innovative practices that align with the bank’s engineering strategies. Advocate for the adoption of cutting-edge Java technologies and frameworks, driving the evolution of backend practices to meet future challenges. Strategy Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Acts as a conduit for the wider domain strategy, for example technical standards. Prioritises and makes available capacity for technical debt. This role is about capability building; it is not to own applications or delivery. Actively shapes and drives towards the Bank-wide engineering strategy and programmes to uplift standards and steer the technological direction towards excellence Act as a custodian for Java backend expertise, providing strategic leadership to enhance skill sets and ensure the delivery of high-performance banking solutions. Business Experienced practitioner and hands-on contributor to the squad delivery for their craft (e.g. Engineering). Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner & Hive Leadership, and in alignment with the fixed capacity model. Responsible for evolving the craft towards improving automation, simplification and innovative use of the latest market trends. Collaborate with product owners and other tech leads to ensure applications meet functional requirements and strategic objectives Processes Promote a feedback-rich environment, utilizing internal and external insights to continuously improve chapter operations. Adopt and embed the Change Delivery Standards throughout the lifecycle of the product / service. Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team. Follows the chapter operating model to ensure a system exists to continue to build capability and performance of the chapter. The Chapter Lead's remit may vary based upon the specific chapter domain they are leading. People & Talent Accountable for people management and capability development of their Chapter members.
Reviews metrics on capabilities and performance across their area, has improvement backlog for their Chapters and drives continual improvement of their chapter. Focuses on the development of people and capabilities as the highest priority. Risk Management Responsible for effective capacity risk management across the Chapter with regards to attrition and leave plans. Ensures the chapter follows the standards with respect to risk management as applicable to their chapter domain. Adheres to common practices to mitigate risk in their respective domain. Design and uphold a robust risk management plan, with contingencies for succession and role continuity, especially in critical positions Governance Ensure all artefacts and assurance deliverables are as per the required standards and policies (e.g., SCB Governance Standards, ESDLC etc.). Regulatory & Business Conduct Ensure a comprehensive understanding of and adherence to local banking laws, anti-money laundering regulations, and other compliance mandates. Conduct business activities with a commitment to legal and regulatory compliance, fostering an environment of trust and respect. Key stakeholders Chapter Area Lead Sub-domain Tech Lead Domain Architect Business Leads / Product owners Other Responsibilities Champion the company's broader mission and values, integrating them into daily operations and team ethos. Undertake additional responsibilities as necessary, ensuring they contribute to the organisation's strategic aims and adhere to Group and other Relevant policies. Skills And Experience Hands-on Java Development Leadership in System Architecture Database Proficiency CI / CD Container Platforms – Kubernetes / OCP / Podman Qualifications Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or related field, with preference given to advanced degrees. 10 years of professional Java development experience, including a proven record in backend system architecture and API design. At least 5 years in a leadership role managing diverse development teams and spearheading complex Java projects. Proficiency in a range of Java frameworks such as Spring, Spring Boot, and Hibernate, and an understanding of Apache Struts. Proficient in Java, with solid expertise in core concepts like object-oriented programming, data structures, and complex algorithms. Knowledgeable in web technologies, able to work with HTTP, RESTful APIs, JSON, and XML Expert knowledge of relational databases such as Oracle, MySQL, PostgreSQL, and experience with NoSQL databases like MongoDB, Cassandra is a plus. Familiarity with DevOps tools and practices, including CI/CD pipeline deployment, containerisation technologies like Docker and Kubernetes, and cloud platforms such as AWS, Azure, or GCP. Solid grasp of front-end technologies (HTML, CSS, JavaScript) for seamless integration with backend systems. Strong version control skills using tools like Git / Bitbucket, with a commitment to maintaining high standards of code quality through reviews and automated tests. Exceptional communication and team-building skills, with the capacity to mentor developers, facilitate technical skill growth, and align team efforts with strategic objectives. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to work effectively in a fast-paced, dynamic environment. About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. 
For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which combine to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 3 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for an experienced Big Data Developer, available to join immediately, with a strong background in Kafka, PySpark, Python/Scala, Spark, SQL, and the Hadoop ecosystem. The ideal candidate should have over 5 years of experience and be ready to join immediately. This role requires hands-on expertise in big data technologies and the ability to design and implement robust data processing solutions. Responsibilities Design, develop, and maintain scalable data processing pipelines using Kafka, PySpark, Python/Scala, and Spark. Work extensively with the Kafka and Hadoop ecosystem, including HDFS, Hive, and other related technologies. Write efficient SQL queries for data extraction, transformation, and analysis. Implement and manage Kafka streams for real-time data processing. Utilize scheduling tools to automate data workflows and processes. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality and integrity by implementing robust data validation processes. Optimize existing data processes for performance and scalability. Requirements Experience with GCP. Knowledge of data warehousing concepts and best practices. Familiarity with machine learning and data analysis tools. Understanding of data governance and compliance standards. This job was posted by Arun Kumar K from krtrimaIQ Cognitive Solutions.
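For illustration, here is a minimal PySpark Structured Streaming sketch of the Kafka-to-Hadoop pattern this role describes. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic name, schema and output paths are placeholders, not details from the posting.

```python
# A minimal sketch of reading JSON events from Kafka with Spark Structured
# Streaming and landing them on HDFS as Parquet. All names are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka_events_stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
         # Kafka delivers bytes; decode the value column and parse the JSON payload.
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Write the parsed stream as Parquet, with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
          .option("path", "hdfs:///data/streams/events/")
          .option("checkpointLocation", "hdfs:///checkpoints/events/")
          .start()
)
query.awaitTermination()
```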
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us Acceldata is the market leader in Enterprise Data Observability. Founded in 2018, Silicon Valley-based Acceldata has developed the world's first Enterprise Data Observability Platform to help build and operate great data products. Enterprise Data Observability is at the intersection of today’s hottest and most crucial technologies such as AI, LLMs, Analytics, and DataOps. Acceldata provides mission-critical capabilities that deliver highly trusted and reliable data to power enterprise data products. Delivered as a SaaS product, Acceldata's solutions have been embraced by global customers, such as HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, Oracle, PubMatic, PhonePe (Walmart), Hersheys, Dun & Bradstreet, and many more. Acceldata is a Series-C funded company whose investors include Insight Partners, March Capital, Lightspeed, Sorenson Ventures, Industry Ventures, and Emergent Ventures. About the Role: We are looking for an experienced Lead SDET for our ODP, specializing in ensuring the quality and performance of large-scale data systems. In this role, you will work closely with development and operations teams to design and execute comprehensive test strategies for Open Source Data Platform (ODP) , including Hadoop, Spark, Hive, Kafka, and other related technologies. You will focus on test automation, performance tuning, and identifying bottlenecks in distributed data systems. Your key responsibilities will include writing test plans, creating automated test scripts, and conducting functional, regression, and performance testing. You will be responsible for identifying and resolving defects, ensuring data integrity, and improving testing processes. Strong collaboration skills are essential as you will be interacting with cross-functional teams and driving quality initiatives. Your work will directly contribute to maintaining high-quality standards for big data solutions and enhancing their reliability at scale. You are a great fit for this role if you have Proven expertise in Quality Engineering, with a strong background in test automation, performance testing, and defect management across multiple data platforms. A proactive mindset to define and implement comprehensive test strategies that ensure the highest quality standards are met. Experience in working with both functional and non-functional testing, with a particular focus on automated test development. A collaborative team player with the ability to effectively work cross-functionally with development teams to resolve issues and deliver timely fixes. Strong communication skills with the ability to mentor junior engineers and share knowledge to improve testing practices across the team. A commitment to continuous improvement, with the ability to analyze testing processes and recommend enhancements to align with industry best practices. Ability to quickly learn new technologies What We Look For 6-10 years of hands-on experience in quality engineering and quality assurance, focusing on test automation, performance testing, and defect management across multiple data platforms Proficiency in programming languages such as Java, Python, or Scala for writing test scripts and automating test cases with hands-on experience in developing automated tests using other test automation frameworks, ensuring robust and scalable test suites. 
Proven ability to define and execute comprehensive test strategies, including writing test plans, test cases, and scripts for both functional and non-functional testing to ensure predictable delivery of high-quality products and solutions. Experience with version control systems like Git and CI/CD tools such as Jenkins or GitLab CI to manage code changes and automate test execution within the development pipeline. Expertise in identifying, tracking, and resolving defects and issues, collaborating closely with developers and product teams to ensure timely fixes. Strong communication skills with the ability to work cross-functionally with development teams and mentor junior team members to improve testing practices and tools. Ability to analyze testing processes, recommend improvements, and ensure the testing environment aligns with industry best practices, contributing to the overall quality of software. Acceldata is an equal-opportunity employer. At Acceldata, we are committed to providing equal employment opportunities regardless of job history, disability, gender identity, religion, race, color, caste, marital/parental status, veteran status, or any other special status. We stand against the discrimination of employees and individuals and are proud to be an equitable workplace that welcomes individuals from all walks of life if they fit the designated roles and responsibilities. Life at Acceldata is all about working with some of the best minds in the industry and experiencing a culture that values an ‘out-of-the-box’ mindset. If you want to push boundaries, learn continuously, and grow to be the best version of yourself, Acceldata is the place to be!
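To illustrate the kind of automated functional checks this SDET role describes for a Spark/Hive-based platform, here is a small pytest sketch; the table and column names are invented for the example, and a real suite would read from actual Hive tables rather than in-memory rows.

```python
# Minimal pytest sketch of functional data checks for a Spark-based data platform.
# Table and column names are hypothetical stand-ins.
import pytest
from pyspark.sql import SparkSession, Row

@pytest.fixture(scope="session")
def spark():
    # A local session suffices for functional tests; performance tests would target a cluster.
    return SparkSession.builder.master("local[2]").appName("odp-tests").getOrCreate()

@pytest.fixture
def orders(spark):
    # Stand-in for a Hive table such as spark.table("staging.orders").
    return spark.createDataFrame([
        Row(order_id="o1", amount=120.0),
        Row(order_id="o2", amount=35.5),
    ])

def test_no_duplicate_keys(orders):
    # Data integrity: the business key must be unique.
    assert orders.count() == orders.select("order_id").distinct().count()

def test_amounts_are_positive(orders):
    # Data quality: no non-positive amounts should reach the curated layer.
    assert orders.filter("amount <= 0").count() == 0
```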
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description Amazon’s eCommerce Foundation (eCF) organization is responsible for the core components that drive the Amazon website and customer experience. Serving millions of customer page views and orders per day, eCF builds for scale. As an organization within eCF, the Business Data Technologies (BDT) group is no exception. We collect petabytes of data from thousands of data sources inside and outside Amazon including the Amazon catalog system, inventory system, customer order system, page views on the website. We provide interfaces for our internal customers to access and query the data hundreds of thousands of times per day, using Amazon Web Service’s (AWS) Redshift, Hive, Spark. We build scalable solutions that grow with the Amazon business. BDT team is building an enterprise-wide Big Data Marketplace leveraging AWS technologies. We work closely with AWS teams like EMR/Spark, Redshift, Athena, S3 and others. We are developing innovative products including the next-generation of data catalog, data discovery engine, data transformation platform, and more with state-of-the-art user experience. We’re looking for top engineers to build them from the ground up. This is a hands-on position where you will do everything from designing & building extremely scalable components to formulating strategy and direction for Big Data at Amazon. You will also mentor junior engineers and work with the most sophisticated customers in the business to help them get the best results. You need to not only be a top software developer with excellent programming skills, have an understanding of big data and parallelization, and a stellar record of delivery, but also excel at leadership, customer obsession and have a real passion for massive-scale computing. Come help us build for the future of Data! Key job responsibilities An SDE-II in the Datashield team would lead product and tech initiatives within the team and beyond by partnering with internal and external stakeholders and teams. They would need to come up with technical strategies and design for complex customer problems by leveraging out of box solutions to enable faster roll outs. They will deliver working software systems consisting of multiple features spanning the full software lifecycle including design, implementation, testing, deployment, and maintenance strategy. The problems they need to solve do not start with a defined technology strategy, and may have conflicting constraints. As technology lead in the team, they will review other SDEs’ work to ensure it fits into the bigger picture and is well designed, extensible, performant, and secure. 
Basic Qualifications 3+ years of non-internship professional software development experience 2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience Experience programming with at least one software programming language Bachelor's degree in computer science or equivalent Preferred Qualifications 3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience 1+ years of building large-scale machine-learning infrastructure for online recommendation, ads ranking, personalization or search experience Knowledge of professional software engineering & best practices for full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A2952490
Posted 3 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description Amazon Selection and Catalog Systems (ASCS) builds the systems that host and run the comprehensive e-commerce product catalog. We power the online shopping experience for customers worldwide, enabling them to find, discover, and purchase anything they desire. Our scaled, distributed systems process hundreds of millions of updates across billions of products, including physical, digital, and service offerings. You will be part of Catalog Support Programs (CSP) team under Catalog Support Operations (CSO) in ASCS Org. CSP provides program management, technical support, and strategic initiatives to enhance the customer experience, owning the implementation of business logic and configurations for ASCS. We are establishing a new centralized Business Intelligence team to build self-service analytical products for ASCS that provide relevant insights and data deep dives across the business. By leveraging advanced analytics and AI/ML, we will transform catalog data into predictive insights, helping prevent customer issues before they arise. Real-time intelligence will support proactive decision-making, enabling faster, data-driven decisions across the organization and driving long-term growth and an enhanced customer experience. We are looking for a creative and goal-oriented BI Engineer to join our team to harness the full potential of data-driven insights to make informed decisions, identify business opportunities and drive business growth. This role requires an individual with excellent analytical abilities, knowledge of business intelligence solutions, as well as business acumen and the ability to work with various tech/product teams across ASCS. This BI Engineer will support ASCS org by owning complex reporting and automating reporting solutions, and ultimately provide insights and drivers for decision making. You must be a self-starter and be able to learn on the go. You should have excellent written and verbal communication skills to be able to work with business owners to develop and define key business questions, and to build data sets that answer those questions. As a Business Intelligence Engineer in the CSP team, you will be responsible for analyzing petabytes of data to identify business trends and points of customer friction, and developing scalable solutions to enhance customer experience and safety. You will work closely with internal stakeholders to define key performance indicators (KPIs), implement them into dashboards and reports, and present insights in a concise and effective manner. This role will involve collaborating with business and tech leaders within ASCS and cross-functional teams to solve problems, create operational efficiencies, and deliver against high organizational standards. You should be able to apply a breadth of tools, data sources, and analytical techniques to answer a wide range of high-impact business questions and proactively uncover new insights that drive decision-making by senior leadership. As a key member of the CSP team, you will continually raise the bar on both quality and performance. You will bring innovation, a strategic perspective, a passionate voice, and an ability to prioritize and execute on a fast-moving set of priorities, competitive pressures, and operational initiatives. There will be a steep learning curve, adding a fair amount of business skills to the individual. 
Key job responsibilities Work closely with BIEs, Data Engineers, and Scientists in the team to collaborate effectively with product managers and create scalable solutions for business problems Create program goals and related metrics, track progress, and manage through obstacles to help the team achieve objectives Identify opportunities for improvement or automation in existing data processes and lead the changes using business acumen and data handling skills Ensure best practices on data integrity, design, testing, implementation, documentation, and knowledge sharing Contribute to supplier operations strategy development based on data analysis Lead strategic projects to formalize and scale organizational processes Build and manage weekly, monthly, and quarterly business review metrics Build data reports and dashboards using SQL, Excel, and other tools to improve business efficiency across programs Understand loosely defined or structured problems and provide BI solutions for difficult problems, delivering large-scale BI solutions Provide solutions that drive the team's business decisions and highlight new opportunities Improve code quality and optimize BI processes Demonstrate proficiency in a scripting language, data modeling, data pipeline design, and applying basic statistical methods (e.g., regression) for difficult business problems A day in the life A day in the life of a BIE-II will include: Working closely with cross-functional teams including Product/Program Managers, Software Development Managers, Applied/Research/Data Scientists, and Software Developers Building dashboards, performing root cause analysis, and sharing actionable insights with stakeholders to enable data-informed decision making Leading reporting and analytics initiatives to drive data-informed decision making Designing, developing, and maintaining ETL processes and data visualization dashboards using Amazon QuickSight Transforming complex business requirements into actionable analytics solutions. About The Team This central BIE team within ASCS will be responsible for building a structured analytical data layer, bringing in BI discipline by defining metrics in a standardized way and establishing a single definition of metrics across the catalog ecosystem. They will also identify clear sources of truth for critical data. The team will build and maintain the data pipelines for critical projects tailored to the needs of ASCS teams, leveraging catalog data to provide a unified view of product information. This will support real-time decision-making and empower teams to make data-driven decisions quickly, driving innovation. This team will leverage advanced analytics that can shift us to a proactive, data-driven approach, enabling informed decisions that drive growth and enhance the customer experience. This team will adopt best practices, standardize metrics, and continuously iterate on queries and data sets as they evolve. Automated quality controls and real-time monitoring will ensure consistent data quality across the organization. Basic Qualifications 4+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. 
Experience with data visualization using Tableau, QuickSight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as R, SAS and MATLAB Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business Experience writing complex SQL queries Bachelor's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field Experience with scripting languages (e.g., Python, Java, R) and big data technologies/languages (e.g. Spark, Hive, Hadoop, PyTorch, PySpark) to build and maintain data pipelines and ETL processes Demonstrate proficiency in SQL, data analysis, and data visualization tools like Amazon QuickSight to drive data-driven decision making. Experience applying basic statistical methods (e.g. regression, t-test, Chi-squared) as well as exploratory, deterministic, and probabilistic analysis techniques to solve complex business problems. Experience gathering business requirements, using industry standard business intelligence tool(s) to extract data, formulate metrics and build reports. Track record of generating key business insights and collaborating with stakeholders. Strong verbal and written communication skills, with the ability to effectively present data insights to both technical and non-technical audiences, including senior management Preferred Qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Master's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field Proven track record of conducting large-scale, complex data analysis to support business decision-making in a data warehouse environment Demonstrated ability to translate business needs into data-driven solutions and vice versa Relentless curiosity and drive to explore emerging trends and technologies in the field Knowledge of data modeling and data pipeline design Experience with statistical analysis, correlation analysis, as well as exploratory, deterministic, and probabilistic analysis techniques Experience in designing and implementing custom reporting systems using automation tools Knowledge of how to improve code quality and optimize BI processes (e.g. speed, cost, reliability) Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2990532
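As a hedged illustration of the "basic statistical methods" this BI posting lists, the snippet below runs a two-sample t-test on synthetic data standing in for two cohorts that would normally be pulled from Redshift via SQL; the metric and cohort framing are invented.

```python
# Illustrative two-sample (Welch's) t-test on synthetic cohort data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=4.0, scale=1.2, size=500)    # e.g. issue-resolution hours, cohort A
treatment = rng.normal(loc=3.7, scale=1.2, size=500)  # cohort B after a hypothetical process change

t_stat, p_value = stats.ttest_ind(control, treatment, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
```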
Posted 3 days ago
8.0 years
8 - 9 Lacs
Gurgaon
On-site
You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. American Express has embarked on an exciting transformation driven by an energetic new team of high performers. This is a great opportunity to join the Customer Marketing organization within American Express Technologies and become a driver of this exciting journey. We are looking for a highly skilled and experienced Senior Engineer with a history of building Bigdata, GCP Cloud, Python and Spark applications. The Senior Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders. Joining the Enterprise Marketing team, this role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. We pride ourselves on a culture of kindness and positivity, and a continuous focus on supporting colleague development to help you achieve your career goals. We lead with integrity, and we emphasize work/life balance for all of our teammates. How will you make an impact in this role? There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing: As a part of our team, you will be developing innovative, high quality, and robust operational engineering capabilities. Develop software in our technology stack which is constantly evolving but currently includes Big data, Spark, Python, Scala, GCP, Adobe Suit ( like Customer Journey Analytics ). Work with Business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps. Create technical solution designs to meet business requirements. Define best practices to be followed by team. Taking your place as a core member of an Agile team driving the latest development practices Identify and drive reengineering opportunities, and opportunities for adopting new technologies and methods. Suggest and recommend solution architecture to resolve business problems. Perform peer code review and participate in technical discussions with the team on the best solutions possible. As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. 
Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. American Express offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. Minimum Qualifications: BS or MS degree in computer science, computer engineering, or other technical discipline, or equivalent work experience. 8+ years of hands-on software development experience with Big Data & Analytics solutions – Hadoop, Hive, Spark, Scala, Python, shell scripting, GCP Cloud BigQuery, Bigtable, Airflow. Working knowledge of the Adobe suite, including Adobe Experience Platform and Adobe Customer Journey Analytics. Proficiency in SQL and database systems, with experience in designing and optimizing data models for performance and scalability. Design and development experience with Kafka, real-time ETL pipelines, and APIs is desirable. Experience in designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies. Certifications in a cloud platform (GCP Professional Data Engineer) are a plus. Understanding of distributed (multi-tiered) systems, data structures, algorithms & Design Patterns. Strong Object-Oriented Programming skills and design patterns. Experience with CI/CD pipelines, Automated test frameworks, and source code management tools (XLR, Jenkins, Git, Maven). Good knowledge and experience with configuration management tools like GitHub. Ability to analyze complex data engineering problems, propose effective solutions, and implement them effectively. Looks proactively beyond the obvious for continuous improvement opportunities. Communicates effectively with product and cross-functional teams. Willingness to learn new technologies and leverage them to their optimal potential. Understanding of various SDLC methodologies, familiarity with Agile & scrum ceremonies. We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
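To make the scheduling and orchestration side of this posting concrete, here is a minimal Airflow DAG sketch; the DAG id, task names, and spark-submit commands are placeholders for illustration, not American Express's actual pipeline.

```python
# Minimal Airflow DAG sketch: a daily extract step followed by a load step.
# DAG id, scripts, and task names are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="customer_marketing_daily_load",   # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_events",
        bash_command="spark-submit extract_events.py --run-date {{ ds }}",
    )
    load = BashOperator(
        task_id="load_to_bigquery",
        bash_command="spark-submit load_bigquery.py --run-date {{ ds }}",
    )
    # Run the load only after the extract succeeds.
    extract >> load
```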
Posted 3 days ago
3.0 years
5 - 8 Lacs
Gurgaon
Remote
Job description About this role Want to elevate your career by being a part of the world's largest asset manager? Do you thrive in an environment that fosters positive relationships and recognizes stellar service? Are analyzing complex problems and identifying solutions your passion? Look no further. BlackRock is currently seeking a candidate to become part of our Global Investment Operations Data Engineering team. We recognize that strength comes from diversity, and will embrace your rare skills, eagerness, and passion while giving you the opportunity to grow professionally and as an individual. We know you want to feel valued every single day and be recognized for your contribution. At BlackRock, we strive to empower our employees and actively engage your involvement in our success. With over USD $11.5 trillion of assets under management, we have an extraordinary responsibility: our technology and services empower millions of investors to save for retirement, pay for college, buy a home and improve their financial well-being. Come join our team and experience what it feels like to be part of an organization that makes a difference. Technology & Operations Technology & Operations(T&O) is responsible for the firm's worldwide operations across all asset classes and geographies. The operational functions are aligned with clients, products, fund structures and our Third-party provider networks. Within T&O, Global Investment Operations (GIO) is responsible for the development of the firm's operating infrastructure to support BlackRock's investment businesses worldwide. GIO spans Trading & Market Documentation, Transaction Management, Collateral Management & Payments, Asset Servicing including Corporate Actions and Cash & Asset Operations, and Securities Lending Operations. GIO provides operational service to BlackRock's Portfolio Managers and Traders globally as well as industry leading service to our end clients. GIO Engineering Working in close partnership with GIO business users and other technology teams throughout Blackrock, GIO Engineering is responsible for developing and providing data and software solutions that support GIO business processes globally. GIO Engineering solutions combine technology, data, and domain expertise to drive exception-based, function-agnostic, service-orientated workflows, data pipelines, and management dashboards. The Role – GIO Engineering Data Lead Work to date has been focused on building out robust data pipelines and lakes relevant to specific business functions, along with associated pools and Tableau / PowerBI dashboards for internal BlackRock clients. The next stage in the project involves Azure / Snowflake integration and commercializing the offering so BlackRock’s 150+ Aladdin clients can leverage the same curated data products and dashboards that are available internally. The successful candidate will contribute to the technical design and delivery of a curated line of data products, related pipelines, and visualizations in collaboration with SMEs across GIO, Technology and Operations, and the Aladdin business. Responsibilities Specifically, we expect the role to involve the following core responsibilities and would expect a successful candidate to be able to demonstrate the following (not in order of priority) Design, develop and maintain a Data Analytics Infrastructure Work with a project manager or drive the project management of team deliverables Work with subject matter experts and users to understand the business and their requirements. 
Help determine the optimal dataset and structure to deliver on those user requirements Work within a standard data / technology deployment workflow to ensure that all deliverables and enhancements are provided in a disciplined, repeatable, and robust manner Work with team lead to understand and help prioritize the team’s queue of work Automate periodic (daily/weekly/monthly/quarterly or other) reporting processes to minimize / eliminate associated developer BAU activities. Leverage industry standard and internal tooling whenever possible in order to reduce the amount of custom code that requires maintenance Experience 3+ years of experience in writing ETL, data curation and analytical jobs using Hadoop-based distributed computing technologies: Spark / PySpark, Hive, etc. 3+ years of knowledge and experience of working with large enterprise databases, preferably cloud-based databases/data warehouses like Snowflake on an Azure or AWS set-up Knowledge and experience in working with Data Science / Machine Learning / Gen AI frameworks in Python (Azure/OpenAI, Meta, etc.) Knowledge and experience building reporting and dashboards using BI tools: Tableau, MS Power BI, etc. Prior experience working on source code version management tools like GitHub, etc. Prior experience working with and following Agile-based workflow paths and ticket-based development cycles Prior experience setting up infrastructure and working on Big Data analytics Strong analytical skills with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy Experience working with SMEs / Business Analysts, and working with Stakeholders for sign-off Our benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R254094
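As a sketch of the curated data-pipeline work this role describes (not the team's actual code), the job below reads a hypothetical Hive table, applies an exception-based filter, and publishes an aggregate that a Tableau or Power BI dashboard could sit on; the table and column names are invented.

```python
# Sketch of a curation job: raw Hive table -> curated summary table for dashboards.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("gio-curation-sketch")
         .enableHiveSupport()
         .getOrCreate())

raw = spark.table("ops_raw.trade_exceptions")           # hypothetical source table

curated = (raw
           .filter(F.col("status") != "RESOLVED")        # exception-based workflow: keep open items
           .withColumn("age_days",
                       F.datediff(F.current_date(), F.col("created_date"))))

summary = (curated
           .groupBy("business_function", "region")
           .agg(F.count("*").alias("open_exceptions"),
                F.avg("age_days").alias("avg_age_days")))

# Persist the curated layer for downstream dashboards.
summary.write.mode("overwrite").saveAsTable("ops_curated.exception_summary")
```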
Posted 3 days ago
0.0 - 2.0 years
0 Lacs
Pune
On-site
The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements Identify and analyze issues, make recommendations, and implement solutions Utilize knowledge of business processes, system processes, and industry standards to solve complex issues Analyze information and make evaluative judgements to recommend solutions and improvements Conduct testing and debugging, utilize script tools, and write basic code for design specifications Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 0-2 years of relevant experience Experience in programming/debugging used in business applications Working knowledge of industry practice and standards Comprehensive knowledge of specific business area for application development Working knowledge of program languages Consistently demonstrates clear and concise written and verbal communication Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. Essential: Experience primarily with SAP Business Objects and Talend ETL Experience with any one other Business Intelligence tool like Tableau/Cognos, ETL tools - Ab Initio/Spark, and Unix shell scripting Experience in RDBMS, preferably Oracle, with SQL query-writing skills. Good understanding of data-warehousing concepts like schemas and facts/dimensions. Should be able to understand and modify complex universe queries, design and use the functionalities of the Web Intelligence tool. Familiarity with identification and resolution of data quality issues. Strong and effective interpersonal and communication skills and the ability to interact professionally with a business user. Great team player with a passion to collaborate with colleagues. Knowledge of any application server (WebLogic, WAS, Tomcat, etc.) Adjacent Skills: Apache Spark with Java Good understanding of Big Data and the Hadoop ecosystem Good understanding of Hive and Impala Testing frameworks (test driven development) Good communication skills Knowledge of Maven, Python scripting skills Good problem-solving skills Beneficial: EMS, Kafka, Domain Knowledge - Job Family Group: Technology - Job Family: Applications Development - Time Type: Full time - Most Relevant Skills Please see the requirements listed above.
- Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi . View Citi’s EEO Policy Statement and the Know Your Rights poster.
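To illustrate the data-warehousing concepts (facts/dimensions) and data-quality checks this posting mentions, here is a small Spark SQL sketch in Python; the schema, table, and column names are hypothetical, and the posting's own stack centers on Business Objects, Talend, and Spark with Java rather than this exact code.

```python
# Illustrative Spark SQL over a simple star schema, plus a basic data-quality check.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("dw-report-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Aggregate a fact table against a dimension (hypothetical dw schema).
report = spark.sql("""
    SELECT d.region,
           d.product_line,
           SUM(f.net_amount)        AS total_amount,
           COUNT(DISTINCT f.txn_id) AS txn_count
    FROM   dw.fact_transactions f
    JOIN   dw.dim_account d
           ON f.account_key = d.account_key
    WHERE  f.txn_date >= date_sub(current_date(), 30)
    GROUP  BY d.region, d.product_line
""")

# Data-quality check: fact rows with no matching dimension row (orphaned keys).
orphans = spark.sql("""
    SELECT COUNT(*) AS missing_dim_rows
    FROM   dw.fact_transactions f
    LEFT JOIN dw.dim_account d ON f.account_key = d.account_key
    WHERE  d.account_key IS NULL
""")

report.show()
orphans.show()
```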
Posted 3 days ago
3.0 years
0 Lacs
Bengaluru
On-site
At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things. Job Title: Senior Associate Data Scientist Location: Bangalore Business & Team: Home Buying Decision Science Impact & contribution: The Senior Associate Data Scientist will use technical knowledge and understanding of the business domain to deliver moderate or highly complex data science projects independently or with minimal guidance. You will also engage and collaborate with business stakeholders to clearly articulate findings to solve business problems. Roles & Responsibilities: Analyse complex data sets to extract insights and identify trends. Develop predictive models and algorithms to solve business problems. Work on deployment of models in production. Collaborate with cross-functional teams to understand requirements and deliver data-driven solutions. Clean, preprocess, and manipulate data for analysis through programming. Communicate findings and recommendations to stakeholders through reports and presentations. Stay updated with industry trends and best practices in data science. Contribute to the development and improvement of data infrastructure and processes. Design experiments and statistical analysis to validate hypotheses and improve models. Continuously learn and enhance skills in data science techniques and tools. Strongly support the adoption of data science across the organization. Identify problems in the products, services and operations of the bank and solve those with innovative research-driven solutions. Essential Skills: Strong hands-on programming experience in Python (mandatory), R, SQL, Hive and Spark. More than 3 years of relevant experience. Ability to write well-designed, modular and optimized code. Knowledge of H2O.ai, GitHub, Big Data and ML Engineering. Knowledge of commonly used data structures and algorithms. Good to have: Knowledge of Time Series, NLP, Deep Learning and Generative AI is preferred. Good to have: Knowledge and hands-on experience in developing solutions with Large Language Models. Must have been part of projects building and deploying predictive models in production (financial services domain preferred) involving large and complex data sets. Strong problem solving and critical thinking skills. Curious, fast learning capability and team player attitude is a must. Ability to communicate clearly and effectively. Demonstrated expertise through blogposts, research, participation in competitions, speaking opportunities, patents and paper publications. Most importantly - ability to identify and translate theories into real applications to solve practical problems. Preferred Skills: Good to have: Knowledge and hands-on experience in data engineering or model deployment Experience in Data Science in either of Credit Risk, Pricing Modelling and Monitoring, Sales and Marketing, Campaign Analytics, Ecommerce Retail or banking products for retail or business banking is preferred. Solid foundation of Statistics and core ML algorithms at a mathematical (under the hood) level. Education Qualifications: Bachelor’s degree in Engineering in Computer Science/Information Technology.
If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696. Advertising End Date: 25/06/2025
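As an illustration of the build-and-validate cycle the role above describes, the snippet below trains and evaluates a classifier on synthetic data with scikit-learn; the propensity-modelling framing and class balance are invented, and a production build would of course source its features from Hive/Spark rather than a generator.

```python
# Compact sketch: train a predictive model on synthetic data and check hold-out AUC.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a prepared modelling table (imbalanced target).
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=7)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=7)

model = GradientBoostingClassifier(random_state=7)
model.fit(X_train, y_train)

# Score the hold-out set and report discrimination.
scores = model.predict_proba(X_test)[:, 1]
print(f"Hold-out AUC: {roc_auc_score(y_test, scores):.3f}")
```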
Posted 3 days ago
8.0 years
0 Lacs
Bengaluru
On-site
Job Description Citi Analytics & Information Management (AIM) team is a global community that objectively connects and analyzes information to create actionable intelligence for our business leaders. It identifies fact-based opportunities for revenue growth and expense reduction in partnership with the businesses. The role of C12 (Individual Contributor) AVP is within the Fraud Analytics team in Citi AIM. The primary area of focus for this position is to analyze transaction data, understand fraud patterns, and develop fraud loss mitigation strategies in support of the overall business goal of minimizing fraud losses as well as minimizing customer impact. The person will also be responsible for monitoring strategy performance, collaborating with the strategy implementation team on strategy implementation, and proactively coming up with fraud loss mitigation measures leveraging new data sources and advanced analytics techniques wherever applicable. Job Description Summary: The individual is expected to be hands-on with analysis on a regular and/or ad hoc basis, extract from different data sources (not limited to transactions) depending upon the business objective, generate fraud risk insights, recommend business solutions, regularly monitor strategy performance, and optimize existing rules to improve rule performance. The individual should be able to work on Logistic Regression/ML model development irrespective of the development environment or programming language, and provide thought leadership for all data solutions, including designing, developing and implementing Machine Learning solutions. The individual should have a holistic view of different retail banking products and best practices, integrating analytical thinking and knowledge of the business along with data analysis tools and methodologies to develop client-centric solutions and recommendations. Analytics experience, preferably in the BFSI domain, with proficiency in basic statistics, hypothesis testing, segmentation and predictive modeling. Comfortable with decision trees (CHAID/CART), Logistic Regression, and exploratory data analysis. Strong in SAS, SQL, Hive, Impala and Excel. Knowledge of Python is desirable. Prior experience in Fraud Analytics is preferable. Knowledge of Tableau or any other data visualization tool is preferable. Experience in stakeholder management across various functions and regions. Translate data into consumer or customer behavioral insights to drive targeting and segmentation strategies and communicate clearly and effectively to business partners and senior leaders. Developed communication and diplomacy skills are required in order to exchange potentially complex/sensitive information. Presentation Skills: Delivering clean and clear presentations to share thoughts, solutions, or problems with business stakeholders and senior management. Project Management – Should have the skillset to manage projects in terms of creating project plans, assigning responsibilities amongst junior team members, completing projects in a timely fashion, and escalating, managing and reporting control issues with transparency. Own and deliver multiple and complex analytic projects. This would require an understanding of business context, conversion of business problems into business solutions and/or models, and implementing such solutions to create economic value. The individual is expected to support regulatory/audit activities and/or other risk and control activities whenever needed.
Qualifications: 8+ years of analytics experience including model development; prior experience in Fraud Analytics preferable Good knowledge of Python, SQL, and Big Data. Working knowledge of the pros, cons, and usage of various ML/DL applications (such as Keras, TensorFlow, Python scikit-learn, etc.) Advanced analytical and business strategy skills Effective communication skills Ability to present to business partners and/or leaders to gain approvals. Project and process management skills Excellent written and verbal communication skills Experience with a prior focus in financial services analytics. Solid organizational skills and ability to manage multiple projects at one time. Self-starter who also has a demonstrated ability to work successfully in a team environment and drive results. Education: Bachelor's/University degree or equivalent experience. Master’s degree is preferable. - Job Family Group: Decision Management - Job Family: Specialized Analytics (Data Science/Computational Statistics) - Time Type: Full time - Most Relevant Skills Please see the requirements listed above. - Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
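To illustrate one technique named in this posting, decision trees (CHAID/CART) used alongside logistic regression in fraud strategy work, here is a hedged sketch that fits a shallow CART-style tree on synthetic, imbalanced data and prints its paths as candidate rules; the feature names and thresholds are invented, not Citi's.

```python
# Sketch: fit a shallow decision tree on synthetic "transactions" and print rule-like paths.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic transactions with ~2% positives to mimic fraud class imbalance.
X, y = make_classification(n_samples=20000, n_features=6, n_informative=4,
                           weights=[0.98, 0.02], random_state=1)
feature_names = ["amount", "hour_of_day", "merchant_risk", "distance_from_home",
                 "velocity_1h", "card_age_days"]  # invented feature names

tree = DecisionTreeClassifier(max_depth=3, class_weight="balanced", random_state=1)
tree.fit(X, y)

# The printed paths read like candidate rules a strategy analyst could review,
# back-test against losses and customer impact, and then refine or discard.
print(export_text(tree, feature_names=feature_names))
```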
Posted 3 days ago
5.0 years
2 - 2 Lacs
Bengaluru
On-site
Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things. Job Title: Data Scientist Location : Bangalore Business & Team: RBS Decision Science Impact & contribution: The Data Scientist will use technical knowledge and understanding of business domain to own and deliver moderate or highly complex data science projects independently or with minimal guidance. You also will engage and collaborate with business stakeholders to clearly articulate findings to solve business problems. Roles & Responsibilities: Lead data-driven initiatives, from problem formulation to model deployment, leveraging advanced statistical techniques and machine learning algorithms. Drive the development and implementation of scalable data solutions, ensuring accuracy and reliability of predictive models. Collaborate with business stakeholders to define project goals, prioritize tasks, and deliver actionable insights. Design and execute experiments to evaluate model performance and optimize algorithms for maximum efficiency. Develop and deploy production-grade machine learning models in cloud-based and on-prem platforms. Lead cross-functional teams in the design and execution of data science projects, ensuring alignment with business objectives. Stay abreast of emerging technologies and industry trends, continuously enhancing expertise in data science methodologies and tools. Drive innovation by exploring new approaches and techniques for solving complex business problems through data analysis and modelling. Mentor junior team members, providing guidance on best practices and technical skills development. Strongly support the adoption of data science across the organization. Identify problems in the products, services and operations of the bank and solve those with innovative research driven solutions. Essential Skills: Strong hands-on programming experience in Python (mandatory), R, SQL, Hive and Spark. 5+ years of experience in above skills Ability to write well designed, modular and optimized code. Knowledge of H2O.ai, GitHub, Big Data and ML Engineering. Knowledge of Snowflake, AWS, Azure etc. Knowledge of commonly used data structures and algorithms. Solid foundation of Statistics and core ML algorithms at a mathematical (under the hood) level. Must have been part of projects building and deploying predictive models in production (financial services domain preferred) involving large and complex data sets. Experience in Data Science in Pricing, Credit Risk, Marketing, Campaign Analytics, Ecommerce Retail or banking products for retail or business banking is preferred. Good to have: Knowledge of Time Series, NLP and Deep Learning and Generative AI is preferred. Good to have: Knowledge and hands-on experience in developing solutions with Large Language Models. Good to have: familiarity with agentic coding such as Roo code and Cline Built and deployed large scale software applications. Understanding of principles of software engineering and cloud computing. Strong problem solving and critical thinking skills. Curious, fast learning capability and team player attitude is a must. Ability to communicate clearly and effectively. 
Demonstrated expertise through blogposts, research, participation in competitions, speaking opportunities, patents and paper publications. Most importantly - ability to identify and translate theories into real applications to solve practical problems. Education Qualifications: Bachelor’s degree in Engineering Or Master’s degree Or Ph.D. in Data Science/ Machine Learning/ Computer Science/ Computational Linguistics/ Statistics/ Mathematics/Engineering. If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696. Advertising End Date: 02/07/2025
Posted 3 days ago
10.0 years
9 - 10 Lacs
Chennai
On-site
Job ID: 5285 Location: Chennai, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 14 Jun 2025 Job Summary The Chapter Lead (Backend Development) is a hands-on developer role focusing on back-end development and is accountable for people management and capability development of their Chapter members. Responsibilities in detail are: RESPONSIBILITIES Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Maintain exemplary coding standards within the team, contributing to code base development and code repository management. Perform code reviews to guarantee quality and promote a culture of technical excellence in Java development. Function as a technical leader and active coder, setting and enforcing domain-specific best practices and technology standards. Allocate technical resources and personal coding time effectively, balancing leadership with hands-on development tasks. Maintain a dual focus on leadership and hands-on development, committing code while steering the chapter's technical direction. Oversee Java backend development standards within the chapter across squads, ensuring uniform excellence and adherence to best coding practices. Harmonize Java development methodologies across the squad, guiding the integration of innovative practices that align with the bank’s engineering strategies. Advocate for the adoption of cutting-edge Java technologies and frameworks, driving the evolution of backend practices to meet future challenges. Strategy Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Acts as a conduit for the wider domain strategy, for example technical standards. Prioritises and makes available capacity for technical debt. This role is about capability building; it is not to own applications or delivery. Actively shapes and drives towards the Bank-Wide engineering strategy and programmes to uplift standards and steer the technological direction towards excellence. Act as a custodian for Java backend expertise, providing strategic leadership to enhance skill sets and ensure the delivery of high-performance banking solutions. Business Experienced practitioner and hands-on contribution to the squad delivery for their craft (e.g. Engineering). Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner & Hive Leadership, and in alignment with the fixed capacity model. Responsible for evolving the craft towards improving automation, simplification and innovative use of latest market trends. Collaborate with product owners and other tech leads to ensure applications meet functional requirements and strategic objectives Processes Promote a feedback-rich environment, utilizing internal and external insights to continuously improve chapter operations. Adopt and embed the Change Delivery Standards throughout the lifecycle of the product / service. Ensure role, job descriptions and expectations are clearly set and periodic feedback provided to the entire team. Follows the chapter operating model to ensure a system exists to continue to build capability and performance of the chapter. The Chapter Lead role may vary based upon the specific chapter domain it is leading.
People & Talent Accountable for people management and capability development of their Chapter members. Reviews metrics on capabilities and performance across their area, has improvement backlog for their Chapters and drives continual improvement of their chapter. Focuses on the development of people and capabilities as the highest priority. Risk Management Responsible for effective capacity risk management across the Chapter with regards to attrition and leave plans. Ensures the chapter follows the standards with respect to risk management as applicable to their chapter domain. Adheres to common practices to mitigate risk in their respective domain. Design and uphold a robust risk management plan, with contingencies for succession and role continuity, especially in critical positions Governance Ensure all artefacts and assurance deliverables are as per the required standards and policies (e.g., SCB Governance Standards, ESDLC etc.). Regulatory & Business Conduct Ensure a comprehensive understanding of and adherence to local banking laws, anti-money laundering regulations, and other compliance mandates. Conduct business activities with a commitment to legal and regulatory compliance, fostering an environment of trust and respect. Key stakeholders Chapter Area Lead Sub-domain Tech Lead Domain Architect Business Leads / Product owners Other Responsibilities Champion the company's broader mission and values, integrating them into daily operations and team ethos. Undertake additional responsibilities as necessary, ensuring they contribute to the organisation's strategic aims and adhere to Group and other Relevant policies. Skills and Experience Hands-on Java Development Leadership in System Architecture Database Proficiency CI / CD Container Platforms – Kubernetes / OCP / Podman Qualifications Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or related field, with preference given to advanced degrees. 10 years of professional Java development experience, including a proven record in backend system architecture and API design. At least 5 years in a leadership role managing diverse development teams and spearheading complex Java projects. Proficiency in a range of Java frameworks such as Spring, Spring Boot, and Hibernate, and an understanding of Apache Struts. Proficient in Java, with solid expertise in core concepts like object-oriented programming, data structures, and complex algorithms. Knowledgeable in web technologies, able to work with HTTP, RESTful APIs, JSON, and XML Expert knowledge of relational databases such as Oracle, MySQL, PostgreSQL, and experience with NoSQL databases like MongoDB, Cassandra is a plus. Familiarity with DevOps tools and practices, including CI/CD pipeline deployment, containerisation technologies like Docker and Kubernetes, and cloud platforms such as AWS, Azure, or GCP. Solid grasp of front-end technologies (HTML, CSS, JavaScript) for seamless integration with backend systems. Strong version control skills using tools like Git / Bitbucket, with a commitment to maintaining high standards of code quality through reviews and automated tests. Exceptional communication and team-building skills, with the capacity to mentor developers, facilitate technical skill growth, and align team efforts with strategic objectives. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to work effectively in a fast-paced, dynamic environment. 
About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What we offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential. www.sc.com/careers
Posted 3 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Data Engineer, you are required to:
Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline.
Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying.
Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies.
Evaluate and implement new tools to improve data engineering processes.
Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.
Experience level: At least 3-5 years of hands-on experience in Data Engineering and ETL.
Desired Knowledge & Experience:
Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming; knowledge of Spark internals (Catalyst/Tungsten/Photon)
Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
IDE & Tooling: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
Testing: pytest, Great Expectations
CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
Languages: Python/Functional Programming (FP)
SQL: T-SQL/Spark SQL/HiveQL
Storage: Data Lake and Big Data Storage Design
Additionally, it is helpful to know the basics of:
Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
Languages: Scala, Java
NoSQL: Cosmos, Mongo, Cassandra
Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
SQL Server: T-SQL, Stored Procedures
Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
Data Catalog: Azure Purview, Apache Atlas, Informatica
Required Soft Skills & Other Capabilities:
Great attention to detail and good analytical abilities.
Good planning and organizational skills.
Collaborative approach to sharing ideas and finding solutions.
Ability to work independently as well as in a global team environment.
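As a rough, illustrative sketch of the batch ETL pattern this role describes (not part of the posting itself), the following minimal PySpark snippet reads a raw dataset, applies basic cleaning and validation, and writes partitioned output. The paths and column names (order_id, order_ts, amount) are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths; on Databricks these would typically point at cloud storage or catalog tables.
RAW_PATH = "/data/raw/orders"
CURATED_PATH = "/data/curated/orders"

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Extract: read the raw CSV files (schema handling kept simple for the sketch).
raw = spark.read.option("header", True).csv(RAW_PATH)

# Transform: basic cleaning and validation - deduplicate, cast types,
# and filter out rows that fail a simple integrity check.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned output (Parquet here; Delta Lake would be the usual choice on Databricks).
curated.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```

Partitioning the curated output by date keeps downstream queries cheap, which is the same motivation behind the lakehouse/medallion layering mentioned in the posting.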
Posted 3 days ago
8.0 - 10.0 years
0 Lacs
Andhra Pradesh
On-site
Software Engineering Advisor - HIH - Evernorth
About Evernorth:
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.
Position Summary:
Data Engineer on the Data Integration team.
Job Description & Responsibilities:
Work with business and technical leadership to understand requirements.
Design to the requirements and document the designs.
Write product-grade, performant code for data extraction, transformation, and loading using Spark and PySpark.
Perform data modeling as needed for the requirements.
Write performant queries using Teradata SQL, Hive SQL, and Spark SQL against Teradata and Hive.
Implement DevOps pipelines to deploy code artifacts onto the designated platforms/servers, such as AWS, Azure, or GCP.
Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos, and retrospectives.
Experience Required:
Overall 8-10 years of experience.
Experience Desired:
Strong development experience in Spark, PySpark, shell scripting, and Teradata.
Strong experience in writing complex and effective SQL (using Teradata SQL, Hive SQL, and Spark SQL) and stored procedures.
Health care domain knowledge is a plus.
Education and Training Required:
Primary Skills:
Excellent work experience with Databricks for Data Lake implementations.
Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory).
Experience in AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch) / GCP / Azure.
Databricks (Delta Lake, Notebooks, Pipelines, cluster management, Azure/AWS integration).
Additional Skills:
Experience in Jira and Confluence.
Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.
Location & Hours of Work: Hybrid, Hyderabad (11:30 AM - 8:30 PM)
Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.
About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
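To illustrate the kind of Spark SQL / Hive SQL querying described in this role (purely an editorial sketch, not part of the posting), the snippet below runs a HiveQL-style aggregation through a Hive-enabled Spark session. The database, table, and column names (healthcare_db.claims, member_id, claim_amount) are invented for the example.

```python
from pyspark.sql import SparkSession

# Hive support lets Spark SQL resolve tables registered in the Hive metastore.
spark = (
    SparkSession.builder
    .appName("claims-aggregation")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical aggregation: total claim amount per member per month,
# keeping only members whose monthly total exceeds a threshold.
monthly_totals = spark.sql("""
    SELECT member_id,
           date_format(claim_date, 'yyyy-MM') AS claim_month,
           SUM(claim_amount)                  AS total_amount
    FROM   healthcare_db.claims
    GROUP  BY member_id, date_format(claim_date, 'yyyy-MM')
    HAVING SUM(claim_amount) > 1000
""")

# Persist the result back to the (hypothetical) warehouse for downstream jobs.
monthly_totals.write.mode("overwrite").saveAsTable("healthcare_db.claims_monthly_totals")
```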
Posted 3 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions.
Create proofs of concept (POCs) / minimum viable products (MVPs), then guide them through to production deployment and operationalization.
Influence machine learning strategy for Digital programs and projects.
Make solution recommendations that appropriately balance speed to market and analytical soundness.
Explore design options to assess efficiency and impact; develop approaches to improve robustness and rigor.
Develop analytical/modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow).
Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations.
Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories.
Create algorithms to extract information from large, multiparametric data sets.
Deploy algorithms to production to identify actionable insights from large databases.
Compare results from various methodologies and recommend optimal techniques.
Develop and embed automated processes for predictive model validation, deployment, and implementation.
Work on multiple pillars of AI including cognitive engineering, conversational bots, and data science.
Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment.
Lead discussions at peer review and use interpersonal skills to positively influence decision making.
Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices.
Facilitate cross-geography sharing of new ideas, learnings, and best practices.
Requirements
Bachelor of Science or Bachelor of Engineering at a minimum.
4+ years of work experience as a Data Scientist.
A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project.
Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala).
Good hands-on skills in both feature engineering and hyperparameter optimization.
Experience producing high-quality code, tests, and documentation.
Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks.
Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and/or deep learning methodologies.
Proficiency in statistical concepts and ML algorithms.
Good knowledge of Agile principles and process.
Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team.
Ability to share ideas in a compelling manner and to clearly summarize and communicate data analysis assumptions and results.
Self-motivated and a proactive problem solver who can work independently and in teams.
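As a small illustration of the hyperparameter-optimization and model-validation skills this role lists (an editorial sketch only, not a requirement of the posting), the snippet below tunes a gradient-boosting classifier with randomized search and cross-validation in scikit-learn, then evaluates it on a hold-out set. A synthetic dataset stands in for real business data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Synthetic data in place of the real business dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Hyperparameter optimization via randomized search with 5-fold cross-validation.
param_distributions = {
    "n_estimators": [100, 200, 400],
    "learning_rate": [0.01, 0.05, 0.1],
    "max_depth": [2, 3, 4],
}
search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=42),
    param_distributions=param_distributions,
    n_iter=10,
    scoring="roc_auc",
    cv=5,
    random_state=42,
)
search.fit(X_train, y_train)

# Hold-out evaluation as a simple form of model validation before deployment.
test_auc = roc_auc_score(y_test, search.predict_proba(X_test)[:, 1])
print(f"Best params: {search.best_params_}, test AUC: {test_auc:.3f}")
```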
Posted 3 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Experience in Big Data technologies such as Hadoop, Apache Spark, and Hive.
Practical experience in Core Java (1.8 preferred), Python, or Scala.
Experience with AWS cloud services including S3, Redshift, EMR, etc.
Strong expertise in RDBMS and SQL.
Good experience in Linux and shell scripting.
Experience building data pipelines using Apache Airflow.
Preferred Technical And Professional Experience
You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
Ability to communicate results to technical and non-technical audiences.
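To give a flavour of the Apache Airflow pipeline experience this role asks for (a hypothetical editorial sketch, assuming Airflow 2.4 or later, not part of the posting), here is a minimal DAG wiring an extract-transform-load sequence with placeholder Python tasks. The DAG name and task bodies are illustrative only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system (e.g. S3 or an API).
    print("extracting raw data")


def transform():
    # Placeholder: cleanse and reshape the data, e.g. with Spark or pandas.
    print("transforming data")


def load():
    # Placeholder: write curated data to the warehouse (e.g. Redshift or Hive).
    print("loading curated data")


with DAG(
    dag_id="daily_etl_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in sequence.
    t_extract >> t_transform >> t_load
```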
Posted 3 days ago
Hive is a popular data warehousing tool used for querying and managing large datasets in distributed storage. In India, the demand for professionals with expertise in Hive is on the rise, with many organizations looking to hire skilled individuals for various roles related to data processing and analysis.
The top hiring locations for Hive roles in India are typically the major tech hubs, which are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.
The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.
Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.
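For readers new to Hive, here is a small, hypothetical example of the querying skills mentioned above (an editorial illustration, not part of any listing): it uses a Hive-enabled Spark session to create a date-partitioned table and run a HiveQL-style aggregation that benefits from partition pruning. The database, table, and column names are invented.

```python
from pyspark.sql import SparkSession

# A Spark session with Hive support can execute HiveQL-style DDL and queries
# against tables registered in the Hive metastore.
spark = SparkSession.builder.appName("hive-demo").enableHiveSupport().getOrCreate()

spark.sql("CREATE DATABASE IF NOT EXISTS sales")

# Partitioning by date is a common Hive pattern for pruning large
# distributed datasets at query time.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.events (
        event_id STRING,
        user_id  STRING,
        amount   DOUBLE
    )
    PARTITIONED BY (event_date DATE)
    STORED AS PARQUET
""")

# Thanks to partition pruning, only the requested date's files are scanned.
daily_revenue = spark.sql("""
    SELECT event_date, SUM(amount) AS revenue
    FROM   sales.events
    WHERE  event_date = DATE '2025-01-01'
    GROUP  BY event_date
""")
daily_revenue.show()
```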
As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!