
5402 Hive Jobs - Page 15

JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

0 years

10 Lacs

Hyderābād

On-site

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever. Responsibilities include, but are not limited to:
• Strong desire to grow a career as a Data Scientist in highly automated industrial manufacturing, doing analysis and machine learning on terabytes and petabytes of diverse datasets.
• Experience in statistical modeling, feature extraction and analysis, and supervised/unsupervised/semi-supervised learning.
• Exposure to the semiconductor industry is a plus but not a requirement.
• Ability to extract data from different databases via SQL and other query languages, applying data cleansing, outlier identification, and missing-data techniques.
• Strong software development skills.
• Strong verbal and written communication skills.
Experience with, or desire to learn:
• Machine learning and other advanced analytical methods
• Fluency in Python and/or R
• pySpark and/or SparkR and/or sparklyr
• Hadoop (Hive, Spark, HBase)
• Teradata and/or other SQL databases
• TensorFlow and/or other statistical software, including scripting capability for automating analyses
• SSIS, ETL
• JavaScript, AngularJS 2.0, Tableau
• Experience working with time-series data, images, semi-supervised learning, and data with frequently changing distributions is a plus
• Experience working with Manufacturing Execution Systems (MES) is a plus
• Papers at CVPR, NIPS, ICML, KDD, and other key conferences are a plus, but this is not a research position
About Micron Technology, Inc. We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all.
With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com. Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron. AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification. Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
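The listing above asks for data cleansing and "outlier identification" on data pulled via SQL. As a minimal sketch of one common technique that phrase could refer to (the IQR rule), with invented sensor readings; a real pipeline would apply this via pandas or pySpark on the extracted datasets:

```python
# Minimal sketch of IQR-based outlier flagging, one common technique the
# posting's "outlier identification" bullet could refer to. Pure-Python
# illustration with made-up readings, not Micron's actual pipeline.

def quartiles(values):
    """Return (Q1, Q3) using linear interpolation over the sorted values."""
    s = sorted(values)
    def q(p):
        idx = p * (len(s) - 1)
        lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
        frac = idx - lo
        return s[lo] * (1 - frac) + s[hi] * frac
    return q(0.25), q(0.75)

def iqr_outliers(values, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = quartiles(values)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

readings = [10.1, 10.3, 9.9, 10.0, 10.2, 55.0, 10.1, 9.8]
print(iqr_outliers(readings))  # the 55.0 spike is flagged
```

The same logic scales to distributed data: Q1/Q3 come from an approximate-quantile pass, and the filter runs per partition.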

Posted 6 days ago

Apply

8.0 years

5 - 10 Lacs

Bengaluru

On-site

Are you intellectually curious, with a passion for promoting solutions across organizational boundaries? Join the Consumer & Community Banking (CCB) Stress Testing Transformation team for a dynamic opportunity to design and build creative solutions for the future of stress testing and annual CCAR exercises. As a Senior Associate on the Stress Testing Transformation Solution team, you will be a strategic thinker, passionate about designing and building creative solutions for the future of stress testing (quarterly stress testing and the annual Comprehensive Capital Analysis and Review exercises). You will spend your time solving complex problems, demonstrating strategic thought leadership, and designing/changing the way our stakeholders operate. Leveraging a deep understanding of the CCB stress testing process and extensive finance domain knowledge, you will build scalable solutions that optimize process efficiencies, improve the use of data assets, and advance platform capabilities.
Job responsibilities
• Collaborate with cross-functional teams to lead the design and implementation of end-to-end solutions for stress testing, assessing and addressing business problems with different technical solutions
• Provide expertise in process re-engineering and guidance based on the “Roadmap” for large-scale stress testing transformation initiatives
• Assess, challenge, and provide solutions for the stress testing end-to-end process, focusing on the source of data, with the ability to influence and drive the roadmap
• Evaluate, recommend, and develop solutions and architecture, including integration of APIs, Python, and AI/ML technology with other enterprise applications
• Leverage data and best-in-class tools, improve processes and controls, and enable cross-business application within a consistent framework
• Break complex issues down into simple, manageable steps or achievements
• Eliminate manual reporting and re-engineer processes, increasing the ability to generate insights faster through an integrated data and platform approach
Required qualifications, capabilities, and skills
• Bachelor’s degree in engineering or a related field
• Experience with business intelligence, analytics, and data wrangling tools such as Alteryx, SAS, or Python
• Experience with relational databases, optimizing SQL to pull and summarize large datasets, report creation, and ad-hoc analyses
• Experience with Hive, Spark SQL, Impala, or other big-data query tools
• Demonstrated ability to think beyond raw data, understand the underlying business context, and sense business opportunities hidden in data
• Ability to collaborate with global teams and deliver in a fast-paced, results-driven environment
• A transformation mindset with strong problem-solving and analytical skills
Preferred qualifications, capabilities, and skills
• Experience with Databricks, SQL and Python, or other data platforms
• 8+ years of experience in analytics solutions and data analytics, preferably in the financial services domain

Posted 6 days ago

Apply

5.0 years

4 - 6 Lacs

Bengaluru

On-site

Degree or postgraduate qualification in Computer Science or a related field (or equivalent industry experience), with a background in Mathematics and Statistics. Minimum 5+ years of development and design experience as a Data Engineer. Experience on Big Data platforms and distributed computing (e.g. Hadoop, Map/Reduce, Spark, HBase, Hive). Experience in data pipeline software engineering and best practices in Python (linting, unit tests, integration tests, git flow/pull request process, object-oriented development, data validation, algorithms and data structures, technical troubleshooting and debugging, bash scripting). Experience in data quality assessment (profiling, anomaly detection) and data documentation (schemas, dictionaries). Experience in data architecture, data warehousing and modelling techniques (relational, ETL, OLTP), considering performance alternatives. Used SQL, PL/SQL or T-SQL with RDBMSs in production environments; NoSQL databases nice to have. Linux OS configuration and use, including shell scripting. Well versed in Agile, DevOps and CI/CD principles (GitHub, Jenkins etc.), and actively involved in troubleshooting issues in a distributed services ecosystem. Experience in Agile methodology. Ensure quality of technical and application architecture and design of systems across the organization. Effectively research and benchmark technology against other best-in-class technologies. Experience in banking, financial services and fintech in an enterprise environment preferred. Able to influence multiple teams on technical considerations, increasing their productivity and effectiveness by sharing deep knowledge and experience. Self-motivated self-starter, able to own and drive things without supervision and work collaboratively with teams across the organization. Excellent soft and interpersonal skills to interact with the team and present ideas.
The engineer should have good listening skills and speak clearly in front of the team, stakeholders and management; should always carry a positive attitude towards work, establish effective team relations and build a climate of trust within the team; and should be enthusiastic and passionate, creating a motivating environment for the team. About Virtusa: teamwork, quality of life, and professional and personal development are values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
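The Virtusa role above lists "data validation" and "unit tests" among its Python pipeline best practices. As a hedged, minimal sketch of what that can look like (the schema, field names, and records are invented for illustration):

```python
# Hypothetical illustration of the "data validation" best practice the
# posting lists; the schema and records are invented, not from the job.

REQUIRED_SCHEMA = {"txn_id": str, "amount": float, "currency": str}

def validate_record(record, schema=REQUIRED_SCHEMA):
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    for field, expected_type in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"{field}: expected {expected_type.__name__}, "
                            f"got {type(record[field]).__name__}")
    return problems

good = {"txn_id": "T1", "amount": 9.5, "currency": "INR"}
bad = {"txn_id": "T2", "amount": "9.5"}
print(validate_record(good))  # []
print(validate_record(bad))   # ['amount: expected float, got str', 'missing field: currency']
```

In a real pipeline this check would run as a unit-tested step (e.g. under pytest) before records are loaded downstream, with failures routed to a quarantine table rather than raised inline.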

Posted 6 days ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Who We Are Wayfair is moving the world so that anyone can live in a home they love – a journey enabled by more than 3,000 Wayfair engineers and a data-centric culture. Wayfair’s Advertising business is rapidly expanding, adding hundreds of millions of dollars in profits to Wayfair. We are building Sponsored Products, Display & Video Ad offerings that cater to a variety of Advertiser goals while showing highly relevant and engaging Ads to millions of customers. We are evolving our Ads Platform to empower advertisers across all sophistication levels to grow their business on Wayfair at a strong, positive ROI and are leveraging state of the art Machine Learning techniques. The Advertising Optimization & Automation Science team is central to this effort. We leverage machine learning and generative AI to streamline campaign workflows, delivering impactful recommendations on budget allocation, target Return on Ad Spend (tROAS), and SKU selection. Additionally, we are developing intelligent systems for creative optimization and exploring agentic frameworks to further simplify and enhance advertiser interactions. We are looking for an experienced Senior Machine Learning Scientist to join the Advertising Optimization & Automation Science team. In this role, you will be responsible for building intelligent, ML-powered systems that drive personalized recommendations and campaign automation within Wayfair’s advertising platform. You will work closely with other scientists, as well as members of our internal Product and Engineering teams, to apply your ML expertise to define and deliver 0-to-1 capabilities that unlock substantial commercial value and directly enhance advertiser outcomes. What You’ll do Design and build intelligent budget, tROAS, and SKU recommendations, and simulation-driven decisioning that extends beyond the current advertising platform capabilities. 
Lead the next phase of GenAI-powered creative optimization and automation to drive significant incremental ad revenue and improve supplier outcomes. Raise technical standards across the team by promoting best practices in ML system design and development. Partner cross-functionally with Product, Engineering, and Sales to deliver scalable ML solutions that improve supplier campaign performance. Ensure systems are designed for reuse, extensibility, and long-term impact across multiple advertising workflows. Research and apply best practices in advertising science, GenAI applications in creative personalization, and auction modeling. Keep Wayfair at the forefront of innovation in supplier marketing optimization. Collaborate with Engineering teams (AdTech, ML Platform, Campaign Management) to build and scale the infrastructure needed for automated, intelligent advertising decisioning. We Are a Match Because You Have : Bachelor's or Master’s degree in Computer Science, Mathematics, Statistics, or related field. 9+ years of experience in building large scale machine learning algorithms. 4+ years of experience working in an architect or technical leadership position. Strong theoretical understanding of statistical models such as regression, clustering and ML algorithms such as decision trees, neural networks, transformers and NLP techniques. Proficiency in programming languages such as Python and relevant ML libraries (e.g., TensorFlow, PyTorch) to develop production-grade products. Strategic thinker with a customer-centric mindset and a desire for creative problem solving, looking to make a big impact in a growing organization. Demonstrated success influencing senior level stakeholders on strategic direction based on recommendations backed by in-depth analysis; Excellent written and verbal communication. Ability to partner cross-functionally to own and shape technical roadmaps Intellectual curiosity and a desire to always be learning! 
Nice to have: Experience with GCP, Airflow, and containerization (Docker). Experience building scalable data processing pipelines with big data tools such as Hadoop, Hive, SQL, Spark, etc. Familiarity with Generative AI and agentic workflows. Experience in Bayesian Learning, Multi-armed Bandits, or Reinforcement Learning. About Wayfair Inc. Wayfair is one of the world’s largest online destinations for the home. Through our commitment to industry-leading technology and creative problem-solving, we are confident that Wayfair will be home to the most rewarding work of your career. If you’re looking for rapid growth, constant learning, and dynamic challenges, then you’ll find that amazing career opportunities are knocking. No matter who you are, Wayfair is a place you can call home. We’re a community of innovators, risk-takers, and trailblazers who celebrate our differences, and know that our unique perspectives make us stronger, smarter, and well-positioned for success. We value and rely on the collective voices of our employees, customers, community, and suppliers to help guide us as we build a better Wayfair – and world – for all. Every voice, every perspective matters. That’s why we’re proud to be an equal opportunity employer. We do not discriminate on the basis of race, color, ethnicity, ancestry, religion, sex, national origin, sexual orientation, age, citizenship status, marital status, disability, gender identity, gender expression, veteran status, genetic information, or any other legally protected characteristic. We are interested in retaining your data for a period of 12 months to consider you for suitable positions within Wayfair. Your personal data is processed in accordance with our Candidate Privacy Notice (which can be found here: https://www.wayfair.com/careers/privacy). If you have any questions regarding our processing of your personal data, please contact us at dataprotectionofficer@wayfair.com.
If you would rather not have us retain your data please contact us anytime at dataprotectionofficer@wayfair.com.

Posted 6 days ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

Experience: 4+ years. Expertise in Python is a must. SQL (ability to write complex SQL queries) is a must. Hands-on experience in Apache Flink Streaming or Spark Streaming is a must. Hands-on expertise with Apache Kafka is a must. Data lake development experience. Orchestration (Apache Airflow preferred). Spark and Hive: optimization of Spark/PySpark and Hive applications. Trino/AWS Athena (good to have). Snowflake (good to have). Data quality (good to have). File storage (S3 good to have). Our offering: global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment; wellbeing programs and work-life balance, with integration and passion-sharing events; attractive salary and company initiative benefits; courses and conferences; hybrid work culture.

Posted 6 days ago

Apply

0.0 - 3.0 years

0 - 0 Lacs

Surat, Gujarat

On-site

Job Overview: We are seeking a highly skilled and experienced Cloud Engineer with expertise in Java and Apache Flink to join our dynamic data engineering team. This role involves building scalable, real-time data pipelines and backend components while managing cloud infrastructure (AWS or GCP). The ideal candidate has a strong foundation in Java development, distributed stream processing, and hands-on experience with cloud-native data systems. Key Responsibilities: Java Backend Development: Write clean, efficient, and well-tested Java code Build POJOs and custom serializers for data processing Manage project dependencies using Maven or Gradle Apache Flink – Real-Time Stream Processing: Develop real-time data pipelines using Apache Flink Utilize Flink’s streaming and batch modes effectively Work with event time vs processing time concepts Implement Flink stateful operations (keyed and operator state) Set up checkpointing, fault-tolerance, and recovery via savepoints Optimize task execution using Flink parallelism, slots, and task chaining Data Integration & Connectors: Integrate Flink with Kafka (source/sink connectors) (Bonus) Experience with Kinesis or Google Pub/Sub Write data to various sinks such as Elasticsearch and MySQL Cloud Engineering: Design and manage scalable cloud-based infrastructure on AWS or GCP Ensure high availability, reliability, and performance of backend services Collaborate with DevOps teams on CI/CD and deployment strategies Required Skills & Qualifications: 3+ years of Java development experience 2+ years of hands-on experience with Apache Flink Strong understanding of distributed stream processing Experience with Kafka integration (source/sink) Familiarity with Elasticsearch, MySQL as data sinks Proficiency with Maven or Gradle build tools Solid grasp of event-driven architecture and real-time systems Experience working with cloud environments (AWS or GCP) Preferred Qualifications: Experience with Google Pub/Sub or Amazon Kinesis Prior 
experience building microservices and containerized apps Familiarity with CI/CD tools, monitoring, and logging frameworks Knowledge of other big data tools (e.g., Spark, Hive) is a plus Why Join Us? Work on cutting-edge data engineering and cloud projects Collaborate with a high-performing and passionate team 5-day work week and a strong focus on work-life balance Competitive salary and performance-based growth Learning opportunities with modern tools and cloud technologies Take the lead in transforming how real-time data powers decision-making across systems. Let’s build the future together. Job Type: Full-time Pay: ₹40,000.00 - ₹74,500.00 per month Benefits: Flexible schedule Health insurance Leave encashment Paid time off Ability to commute/relocate: Surat, Gujarat: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: Flink Java Developer: 2 years (Required) Cloud Engineer: 3 years (Required) Location: Surat, Gujarat (Required) Work Location: In person Speak with the employer +91 9904361666
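The Flink role above asks for event time vs. processing time, keyed state, and windowed stream processing. As a concept sketch only (written in Python for brevity; real Flink work would use the Java DataStream API with watermarks and a KeyedProcessFunction), keyed tumbling-window counting over event timestamps looks like:

```python
# Concept sketch of keyed state + event-time tumbling windows, the ideas
# the posting names. A plain dict stands in for Flink keyed state; there
# is no watermarking or checkpointing here, just the windowing arithmetic.

from collections import defaultdict

WINDOW_MS = 60_000  # 1-minute tumbling windows on event time

def window_start(event_time_ms):
    """Align an event timestamp to the start of its tumbling window."""
    return event_time_ms - (event_time_ms % WINDOW_MS)

def keyed_window_counts(events):
    """events: iterable of (key, event_time_ms). Count per (key, window)."""
    counts = defaultdict(int)  # stands in for Flink keyed state
    for key, ts in events:
        counts[(key, window_start(ts))] += 1
    return dict(counts)

events = [("user_a", 5_000), ("user_a", 59_000), ("user_b", 61_000),
          ("user_a", 120_500)]
print(keyed_window_counts(events))
# {('user_a', 0): 2, ('user_b', 60000): 1, ('user_a', 120000): 1}
```

Because windows are assigned from event time rather than arrival time, a late-arriving event still lands in the window its timestamp belongs to; in Flink, watermarks decide when such a window may finally fire.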

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Qualification Skills: 5+ years of experience with Java + Big Data as the minimum required skill set: Java, Microservices, Spring Boot, APIs, Big Data (Hive, Spark, PySpark). Role Skills: same as the qualification skills above. Experience 5 to 7 years Job Reference Number 13049

Posted 6 days ago

Apply

12.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Qualification BTech degree in computer science, engineering or a related field of study, or 12+ years of related work experience. 7+ years of design and implementation experience with large-scale data-centric distributed applications. Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines like compute, networking, storage, security, databases etc. Good understanding of data engineering concepts like storage, governance, cataloging, data quality, data modeling etc. Good understanding of various architecture patterns like data lake, data lakehouse, data mesh etc. Good understanding of data warehousing concepts, with hands-on experience working with tools like Hive, Redshift, Snowflake, Teradata etc. Experience migrating or transforming legacy customer solutions to the cloud. Experience working with services like AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, DataZone etc. Thorough understanding of Big Data ecosystem technologies like Hadoop, Spark, Hive, HBase etc. and other competent tools and technologies. Understanding of designing analytical solutions leveraging AWS cognitive services like Textract, Comprehend, Rekognition etc. in combination with SageMaker is good to have. Experience working with modern development workflows, such as git, continuous integration/continuous deployment pipelines, static code analysis tooling, infrastructure-as-code, and more. Experience with a programming or scripting language: Python/Java/Scala. AWS Professional/Specialty certification or relevant cloud expertise. Role Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries. Capable of leading a technology team, inculcating an innovative mindset and enabling fast-paced deliveries. Able to adapt to new technologies, learn quickly, and manage high ambiguity.
Ability to work with business stakeholders and attend/drive various architectural, design and status calls with multiple stakeholders. Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers. Drive technology/software sales or pre-sales consulting discussions. Ensure end-to-end ownership of all assigned tasks. Ensure high-quality software development with complete documentation and traceability. Fulfil organizational responsibilities (sharing knowledge and experience with other teams/groups). Conduct technical trainings/sessions and write whitepapers/case studies/blogs. Experience 10 to 18 years Job Reference Number 12895

Posted 6 days ago

Apply

3.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Qualification OLAP, data engineering, data warehousing, ETL. Hadoop ecosystem or AWS, Azure or GCP cluster and processing. Experience working on Hive, Spark SQL, Redshift or Snowflake. Experience in writing and troubleshooting SQL or MDX queries. Experience working on Linux. Experience with Microsoft Analysis Services (SSAS) or OLAP tools. Tableau, MicroStrategy or any BI tool. Expertise in programming in Python, Java or shell script would be a plus. Role Be the front-facing person of the world’s most scalable OLAP product company, Kyvos Insights. Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area. Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems. Be the go-to person for prospects regarding technical issues during the POV stage. Be instrumental in reading the pulse of the big data market and defining the roadmap of the product. Lead a few small but highly efficient teams of big data engineers. Report task status efficiently to stakeholders and customers. Good verbal and written communication skills. Be willing to work off hours to meet timelines. Be willing to travel or relocate as per project requirements. Experience 3 to 6 years Job Reference Number 10350

Posted 6 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Intellismith Intellismith, founded in 2019, is a dynamic HR service and technology startup. Our mission is to tackle India’s employability challenges head-on. We specialize in scaling talent acquisition and technology resource outsourcing. As an IBM and Microsoft Business Partner, we also leverage industry-leading solutions to enhance and diversify our offerings. As we chart our growth trajectory, we’re transitioning from a service-centric model to a product-focused company. Our journey involves building a cutting-edge skilling platform to empower Indian youth with domain-specific training, making them job-ready for the competitive market. Why Join Intellismith? Impactful Mission: Be part of a forward-thinking organisation committed to solving employability challenges. Your work directly contributes to bridging the skills gap and transforming lives. Innovation and Growth: Contribute to our exciting transition from services to products. Shape the future of our skilling platform and impact Indian youth positively. Collaborative Environment: Work alongside talented professionals across multiple locations. Our diverse teams foster creativity and learning. Entrepreneurial Spirit: Intellismith encourages fresh ideas and entrepreneurial thinking. Your voice matters here. As a leading outsourcing partner, we are hiring a SQL Developer to work on a project for our client, the largest provider of telecoms and mobile money services in 14 countries spanning Sub-Saharan, Central, and Western Africa. Job Details: Experience: Minimum 5 years of relevant experience in writing complex SQL. Qualification: BE / B Tech / MCA / BCA / MTech. Location: Gurugram (WFO - 5 days). CTC Bracket: 28 LPA. Notice Period: Immediate to 15 days (candidates with a notice period of less than 30 days are preferred). Mandatory Skills: Must have experience in SQL design and development.
Must have experience writing complex SQL queries for data retrieval and manipulation on RDBMSs. Must have experience in database development. Responsibilities: Designing, developing, and maintaining databases. Writing complex SQL queries for data retrieval and manipulation on RDBMSs and the big data ecosystem. Optimizing database performance and ensuring data integrity. Building appropriate and useful reporting deliverables. Analyzing existing SQL queries for performance improvements. Troubleshooting and resolving database-related issues. Collaborating with cross-functional teams to gather requirements and implement solutions. Creating and maintaining database documentation. Implementing and maintaining database security measures. Required Skills: Strong proficiency in SQL and database concepts. Good experience working with Hive tables, Trino queries, and the big data ecosystem for data retrieval. Experience with database development tools and technologies such as Oracle, PostgreSQL, etc. Familiarity with performance tuning and query optimization. Knowledge of data modeling and database design principles.
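The SQL Developer role above centers on complex queries for retrieval and summarization. As a small, hedged illustration of the kind of query involved (a window function ranking rows within a partition), run against an in-memory SQLite database with an invented table and data:

```python
# Hedged sketch of "complex SQL for data retrieval": a window-function
# query ranking transactions per account. The txns table and its rows are
# invented for illustration; in-memory SQLite stands in for the RDBMS.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (account TEXT, amount REAL);
    INSERT INTO txns VALUES
        ('A', 100), ('A', 300), ('A', 200),
        ('B', 50),  ('B', 75);
""")

rows = conn.execute("""
    SELECT account, amount,
           RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
    FROM txns
    ORDER BY account, rnk
""").fetchall()

for row in rows:
    print(row)
# first row: ('A', 300.0, 1) - each account's transactions ranked by amount
```

The same `RANK() OVER (PARTITION BY ...)` pattern carries over to Hive and Trino; only the surrounding dialect (table formats, functions) changes.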

Posted 6 days ago

Apply

5.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Qualification Required Proven hands-on experience designing, developing and supporting database projects for analysis in a demanding environment. Proficient in database design techniques, both relational and dimensional. Experience with, and a strong understanding of, the business analysis techniques used. High proficiency in the use of SQL or MDX queries. Ability to manage multiple maintenance, enhancement and project-related tasks. Ability to work independently on multiple assignments and to work collaboratively within a team. Strong communication skills with both internal team members and external business stakeholders. Added Advantage Hadoop ecosystem or AWS, Azure or GCP cluster and processing. Experience working on Hive, Spark SQL, Redshift or Snowflake will be an added advantage. Experience working on Linux systems. Experience with Tableau, MicroStrategy, Power BI or any BI tool will be an added advantage. Expertise in programming in Python, Java or shell script would be a plus. Role Roles & Responsibilities: Be the front-facing person of the world’s most scalable OLAP product company, Kyvos Insights. Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area. Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems. Be the go-to person for customers regarding technical issues during the project. Be instrumental in reading the pulse of the big data market and defining the roadmap of the product. Lead a few small but highly efficient teams of big data engineers. Report task status efficiently to stakeholders and customers. Good verbal and written communication skills. Be willing to work off hours to meet timelines. Be willing to travel or relocate as per project requirements. Experience 5 to 10 years Job Reference Number 11078

Posted 6 days ago

Apply

3.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Qualification Pre-Sales Solution Engineer - India. Experience areas or skills: Pre-sales experience with software or analytics products. Excellent verbal and written communication skills. OLAP tools or Microsoft Analysis Services (MSAS). Data engineering, data warehousing or ETL. Hadoop ecosystem or AWS, Azure or GCP cluster and processing. Tableau, MicroStrategy or any BI tool. HiveQL, Spark SQL, PL/SQL or T-SQL. Writing and troubleshooting SQL or MDX queries. Working on Linux; programming in Python, Java or JavaScript would be a plus. Filling in RFPs or questionnaires from customers. NDA, success criteria, project closure and other documentation. Be willing to travel or relocate as per requirements. Role Acts as the main point of contact for customer contacts involved in the evaluation process. Gives product demonstrations to qualified leads, and in support of marketing activity such as events or webinars. Owns the RFP, NDA, PoC success criteria document, PoC closure and other documents. Secures alignment on process and documents with the customer/prospect. Owns the technical-win phases of all active opportunities. Understands the customer domain and database schema. Provides OLAP and reporting solutions. Works closely with customers to understand and resolve environment, OLAP cube or reporting related issues. Coordinates with the solutioning team for execution of the PoC as per the success plan. Creates enhancement requests or identifies requests for new features on behalf of customers or hot prospects. Experience 3 to 6 years Job Reference Number 10771

Posted 6 days ago

Apply

8.0 - 10.0 years

30 - 32 Lacs

Hyderabad

Work from Office

Candidate Specifications: Candidates should have 9+ years of experience, including 9+ years in Python and PySpark, and strong experience in AWS and PL/SQL. Candidates should be strong in data management, with data governance and data streaming along with data lakes and data warehouses. Candidates should also have exposure to team handling and stakeholder management. Candidates should have excellent written and verbal communication skills. Contact Person: Sheena Rakesh

Posted 6 days ago


15.0 years

0 Lacs

Greater Lucknow Area

On-site

Qualification 15+ years of experience managing and implementing high-end software products. Expertise in Java/J2EE, EDW/SQL or Hadoop/Hive/Spark, preferably hands-on. Good knowledge of any of the clouds (AWS/Azure/GCP) – Must Have Has managed, delivered and implemented complex projects dealing with considerable data size (TB/PB) and high complexity Experience in handling migration projects Good To Have Data ingestion, processing and orchestration knowledge Role Senior Technical Project Managers (STPMs) are in charge of all aspects of technical projects. This is a multi-dimensional and multi-functional role. You will need to be comfortable reporting program status to executives, as well as diving deep into technical discussions with internal engineering teams and external partners. You should collaborate with, and leverage, colleagues in business development, product management, analytics, marketing, engineering, and partner organizations. You have to manage multiple projects and ensure all releases are on time. You are responsible for managing and delivering the technical solution to support an organization’s vision and strategic direction. You should be capable of working with different types of customers and should possess good customer-handling skills. Experience working in the ODC model and capability of presenting the technical design and architecture to senior technical stakeholders. Should have experience in defining the project and delivery plan for each assignment Capable of doing resource allocation as per the requirements of each assignment Should have experience driving RFPs. Should have experience of account management – revenue forecasting, invoicing, SOW creation etc. Experience 15 to 20 years Job Reference Number 13010

Posted 6 days ago


175.0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? As a Data Engineer, you will be responsible for designing, developing, and maintaining robust and scalable frameworks/services/applications/pipelines for processing huge volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our organizational needs. Key Responsibilities: Design and develop solutions using Big Data tools and technologies like BigQuery, Hive, Spark etc. Extensive hands-on experience in object-oriented programming using Python, PySpark APIs etc. Experience in building data pipelines for huge volumes of data. Experience in designing, implementing, and managing various ETL job execution flows. Experience in implementing and maintaining Data Ingestion processes. Hands-on experience in writing basic to advanced optimized queries using HQL, SQL & Spark. Hands-on experience in designing, implementing, and maintaining Data Transformation jobs using the most efficient tools/technologies. Ensure the performance, quality, and responsiveness of solutions. Participate in code reviews to maintain code quality. Should be able to write shell scripts. Utilize Git for source version control. Set up and maintain CI/CD pipelines. Troubleshoot, debug, and upgrade existing applications & ETL job chains. Required Skills and Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field.
Proven experience as a Data Engineer or in a similar role. Strong proficiency in object-oriented programming using Python. Experience with ETL job design principles. Solid understanding of HQL, SQL and data modelling. Knowledge of Unix/Linux and shell scripting principles. Familiarity with Git and version control systems. Experience with Jenkins and CI/CD pipelines. Knowledge of software development best practices and design patterns. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience with cloud platforms such as Google Cloud. We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
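The ETL responsibilities described in this listing — casting types, rejecting malformed rows, aggregating — can be sketched in plain Python. The record layout, field names, and amounts below are invented for illustration only:

```python
# Hypothetical raw records; account IDs, field names, and amounts are invented.
raw = [
    {"account": "A1", "amount": "120.50", "currency": "USD"},
    {"account": "A1", "amount": "79.50", "currency": "USD"},
    {"account": "A2", "amount": "bad", "currency": "USD"},  # malformed row
    {"account": "A2", "amount": "40.00", "currency": "USD"},
]

def transform(records):
    """Cleanse malformed rows, cast amounts to float, and aggregate spend per account."""
    totals = {}
    for rec in records:
        try:
            amount = float(rec["amount"])  # type cast; non-numeric rows are rejected
        except ValueError:
            continue  # a real job would route rejects to an audit/error store
        totals[rec["account"]] = totals.get(rec["account"], 0.0) + amount
    return totals

print(transform(raw))  # {'A1': 200.0, 'A2': 40.0}
```

A production job on Hive or BigQuery would express the same cleanse-and-aggregate step in HQL/SQL or PySpark, with rejected rows routed to an audit table.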

Posted 6 days ago


175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Team Overview: Global Credit & Model Risk Oversight, Transaction Monitoring & GRC Capabilities (CMRC) provides independent challenge and ensures that significant Credit and Model risks are properly evaluated and monitored, and Anti-Money Laundering (AML) risks are mitigated through the transaction monitoring program. In addition, CMRC hosts the central product organization responsible for the ongoing maintenance and modernization of GRC platforms and capabilities. How will you make an impact in this role? The AML Data Capabilities team was established with a mission to own and govern data across products – raw data, derivations, and organized views to cater to analytics and production use cases – and to manage end-to-end data quality. This team comprises risk data experts with deep SME knowledge of risk data, systems and processes covering all aspects of the customer life cycle. Our mission is to build and support Anti-Money Laundering Transaction Monitoring data and rule needs in collaboration with Strategy and Technology partners, with focus on our core tenets of timeliness, quality and process efficiency. Responsibilities include: · Develop and maintain organized data layers to cater to both production use cases and analytics for Transaction Monitoring of Anti-Money Laundering rules.
· Manage end-to-end Big Data integration processes for building key variables from disparate source systems, with 100% accuracy and 100% on-time delivery · Partner closely with Strategy and Modeling teams in building incremental intelligence, with strong emphasis on maintaining globalization and standardization of attribute calculations across portfolios · Partner with Tech teams in designing and building next-generation data quality controls · Drive automation initiatives within existing processes and fully optimize delivery effort and processing time · Effectively manage relationships with stakeholders across multiple geographies · Contribute to evaluating and/or developing the right tools, common components, and capabilities · Follow industry-best agile practices to deliver on key priorities · Implement defined rules on the Lucy platform in order to identify AML alerts · Ensure processes and actions are logged and support regulatory reporting, documenting the analysis and the rule build in the form of a qualitative document for relevant stakeholders.
Minimum Qualifications · Academic background: Bachelor’s degree with up to 2 years of relevant work experience · Strong Hive and SQL skills; knowledge of Big Data and related technologies · Hands-on experience with Hadoop & shell scripting is a plus · Understanding of Data Architecture & Data Engineering concepts · Strong verbal and written communication skills, with the ability to cater to versatile technical and non-technical audiences · Willingness to collaborate with cross-functional teams to drive validation and project execution · Good-to-have skills: Python / PySpark · Excellent analytical & critical thinking with attention to detail · Excellent planning and organizational skills, including the ability to manage inter-dependencies and execute under stringent deadlines · Exceptional drive and commitment; ability to work and thrive in a fast-changing, results-driven environment; and proven ability in handling competing priorities Behavioral Skills/Capabilities: Enterprise Leadership Behaviors Set the Agenda: · Ability to apply thought leadership and come up with ideas · Take the complete perspective into the picture while designing solutions · Use market best practices to design solutions Bring Others with You: · Collaborate with multiple stakeholders and other scrum teams to deliver on promise · Learn from peers and leaders · Coach and help peers Do It the Right Way: · Communicate effectively · Be candid and clear in communications · Make decisions quickly & effectively · Live the company culture and values We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
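As a rough illustration of the kind of transaction-monitoring rule such a team builds, here is a minimal structuring-style check in plain Python. The customer IDs, amounts, window, and threshold are all invented, and a real rule on the Lucy platform would be far richer:

```python
from collections import defaultdict
from datetime import date, timedelta

# Invented transactions: (customer_id, posting_date, cash_amount).
transactions = [
    ("C1", date(2024, 1, 1), 4_000),
    ("C1", date(2024, 1, 3), 4_500),
    ("C1", date(2024, 1, 5), 3_000),
    ("C2", date(2024, 1, 2), 2_000),
]

def flag_structuring(txns, window_days=7, threshold=10_000):
    """Flag customers whose cash deposits within any rolling window exceed a threshold."""
    by_customer = defaultdict(list)
    for cust, day, amount in txns:
        by_customer[cust].append((day, amount))
    flagged = set()
    for cust, rows in by_customer.items():
        rows.sort()  # order by posting date
        for i, (start, _) in enumerate(rows):
            end = start + timedelta(days=window_days - 1)
            window_total = sum(a for d, a in rows[i:] if d <= end)
            if window_total > threshold:
                flagged.add(cust)
                break
    return flagged

print(flag_structuring(transactions))  # {'C1'}: 11,500 deposited within 5 days
```

In production this logic would typically live in Hive/Spark over organized data layers, with every alert logged for regulatory reporting as the listing describes.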

Posted 6 days ago


175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let's lead the way together. With a focus on digitization, innovation, and analytics, the Enterprise Digital teams create central, scalable platforms and customer experiences to help markets across all of these priorities. Their charter is to drive scale for the business and accelerate innovation for both immediate impact and long-term transformation of our business. A unique aspect of the Enterprise Digital teams is the integration of diverse skills across their remit. The Enterprise Digital teams have a very broad range of responsibilities, resulting in a broad range of initiatives around the world. The American Express Enterprise Digital Experimentation & Analytics (EDEA) team leads the Enterprise Product Analytics and Experimentation charter for Brand & Performance Marketing and Digital Acquisition & Membership experiences, as well as Enterprise Platforms. The focus of this collaborative team is to drive growth by enabling efficiencies in paid performance channels & evolving our digital experiences with actionable insights & analytics. The team specializes in using data around digital product usage to drive improvements in the acquisition customer experience to deliver higher satisfaction and business value. About this Role: This role will report to the Manager of the Membership Experience Analytics team within Enterprise Digital Experimentation & Analytics (EDEA) and will be based in Gurgaon.
The candidate will be responsible for the delivery of highly impactful analytics to optimize our Digital Membership Experiences across Web & App channels. Deliver strategic analytics focused on Digital Membership experiences across Web & App, aimed at optimizing our customer experiences Define and build key KPIs to monitor the acquisition journey performance and success Support the development of new products and capabilities Deliver read-outs of experiments, uncovering insights and learnings that can be utilized to further optimize the customer journey Gain deep functional understanding of the enterprise-wide product capabilities and associated platforms over time and ensure analytical insights are relevant and actionable Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with AXP closed-loop data Minimum Qualifications Advanced degree in a quantitative field (e.g. Finance, Engineering, Mathematics, Computer Science) Strong programming skills are preferred. Some experience with Big Data programming languages (Hive, Spark), Python, SQL. Experience in large data processing and handling; an understanding of data science is a plus. Ability to work in a dynamic, cross-functional environment, with strong attention to detail. Excellent communication skills with the ability to engage, influence, and encourage partners to drive collaboration and alignment. Preferred Qualifications Strong analytical/conceptual thinking competence to solve unstructured and complex business problems and articulate key findings to senior leaders/partners in a succinct and concise manner. Basic knowledge of statistical techniques for experimentation & hypothesis testing: regression, t-test, chi-square test. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
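The experiment read-outs and hypothesis-testing skills this listing asks for often reduce to a two-proportion z-test on conversion rates. Here is a minimal sketch in pure Python; the traffic and conversion numbers are hypothetical:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return z, p_value

# Hypothetical experiment: control converts 500/10,000; variant converts 600/10,000.
z, p = two_proportion_ztest(500, 10_000, 600, 10_000)
# z is about 3.1 and p is well below 0.05, so this lift would read as significant
```

In practice a library routine (e.g. statsmodels' `proportions_ztest`) would be used instead of hand-rolled math, but the read-out logic is the same.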

Posted 6 days ago


10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary The Chapter Lead Backend Development role is a hands-on developer role focusing on back-end development, and is accountable for people management and capability development of its Chapter members. Responsibilities in detail are: Responsibilities Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Maintain exemplary coding standards within the team, contributing to code base development and code repository management. Perform code reviews to guarantee quality and promote a culture of technical excellence in Java development. Function as a technical leader and active coder, setting and enforcing domain-specific best practices and technology standards. Allocate technical resources and personal coding time effectively, balancing leadership with hands-on development tasks. Maintain a dual focus on leadership and hands-on development, committing code while steering the chapter's technical direction. Oversee Java backend development standards within the chapter across squads, ensuring uniform excellence and adherence to best coding practices. Harmonize Java development methodologies across the squad, guiding the integration of innovative practices that align with the bank’s engineering strategies. Advocate for the adoption of cutting-edge Java technologies and frameworks, driving the evolution of backend practices to meet future challenges. Strategy Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Acts as a conduit for the wider domain strategy, for example technical standards. Prioritises and makes available capacity for technical debt. This role is about capability building; it is not to own applications or delivery.
Actively shapes and drives the bank-wide engineering strategy and programmes to uplift standards and steer the technological direction towards excellence Act as a custodian for Java backend expertise, providing strategic leadership to enhance skill sets and ensure the delivery of high-performance banking solutions. Business Experienced practitioner making hands-on contributions to squad delivery for their craft (e.g. Engineering). Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner & Hive Leadership, and in alignment with the fixed capacity model. Responsible for evolving the craft towards improving automation, simplification and innovative use of the latest market trends. Collaborate with product owners and other tech leads to ensure applications meet functional requirements and strategic objectives Processes Promote a feedback-rich environment, utilizing internal and external insights to continuously improve chapter operations. Adopt and embed the Change Delivery Standards throughout the lifecycle of the product / service. Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team. Follows the chapter operating model to ensure a system exists to continue to build capability and performance of the chapter. The Chapter Lead role may vary based upon the specific chapter domain it is leading. People & Talent Accountable for people management and capability development of their Chapter members. Reviews metrics on capabilities and performance across their area, maintains an improvement backlog for their Chapters and drives continual improvement of their chapter. Focuses on the development of people and capabilities as the highest priority. Risk Management Responsible for effective capacity risk management across the Chapter with regards to attrition and leave plans.
Ensures the chapter follows the standards with respect to risk management as applicable to their chapter domain. Adheres to common practices to mitigate risk in their respective domain. Design and uphold a robust risk management plan, with contingencies for succession and role continuity, especially in critical positions. Governance Ensure all artefacts and assurance deliverables meet the required standards and policies (e.g., SCB Governance Standards, ESDLC etc.). Regulatory & Business Conduct Ensure a comprehensive understanding of and adherence to local banking laws, anti-money laundering regulations, and other compliance mandates. Conduct business activities with a commitment to legal and regulatory compliance, fostering an environment of trust and respect. Key Stakeholders Chapter Area Lead Sub-domain Tech Lead Domain Architect Business Leads / Product Owners Other Responsibilities Champion the company's broader mission and values, integrating them into daily operations and team ethos. Undertake additional responsibilities as necessary, ensuring they contribute to the organisation's strategic aims and adhere to Group and other relevant policies. Qualification Requirements & Skills Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or a related field, with preference given to advanced degrees. 10 years of professional Java development experience, including a proven record in backend system architecture and API design. At least 5 years in a leadership role managing diverse development teams and spearheading complex Java projects. Proficiency in a range of Java frameworks such as Spring, Spring Boot, and Hibernate, and an understanding of Apache Struts. Proficient in Java, with solid expertise in core concepts like object-oriented programming, data structures, and complex algorithms.
Knowledgeable in web technologies, able to work with HTTP, RESTful APIs, JSON, and XML Expert knowledge of relational databases such as Oracle, MySQL, PostgreSQL, and experience with NoSQL databases like MongoDB, Cassandra is a plus Familiarity with DevOps tools and practices, including CI/CD pipeline deployment, containerisation technologies like Docker and Kubernetes, and cloud platforms such as AWS, Azure, or GCP. Solid grasp of front-end technologies (HTML, CSS, JavaScript) for seamless integration with backend systems. Strong version control skills using tools like Git / Bitbucket with a commitment to maintaining high standards of code quality through reviews and automated tests. Exceptional communication and team-building skills, with the capacity to mentor developers, facilitate technical skill growth, and align team efforts with strategic objectives. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to work effectively in a fast-paced, dynamic environment. Role Specific Technical Competencies Hands-on Java Development Leadership in System Architecture Database Proficiency CI / CD Container Platforms – Kubernetes / OCP / Podman About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. 
When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 6 days ago


6.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Open Location - Indore, Noida, Gurgaon, Bangalore, Hyderabad, Pune Job Description 6-9 years' experience working on data engineering & ETL/ELT processes, data warehousing, and data lake implementation with AWS or Azure services. Hands-on experience in designing and implementing solutions like creating/deploying jobs, orchestrating the job/pipeline, and infrastructure configurations. Expertise in designing and implementing PySpark and Spark SQL based solutions. Design and implement data warehouses using Amazon Redshift, ensuring optimal performance and cost efficiency. Good understanding of security, compliance, and governance standards. Roles & Responsibilities Design and implement robust and scalable data pipelines using AWS or Azure services. Drive architectural decisions for data solutions on AWS, ensuring scalability, security, and cost-effectiveness. Hands-on experience developing and deploying ETL/ELT processes using Glue/Azure Data Factory, Lambda/Azure Functions, Step Functions/Azure Logic Apps/MWAA, S3 and Lake Formation from various data sources. Strong proficiency in PySpark, SQL, Python. Proficiency in SQL for data querying and manipulation. Experience with data modelling, ETL processes, and data warehousing concepts. Create and maintain documentation for data pipelines and processes, following best practices. Knowledge of various Spark optimization techniques, monitoring and automation would be a plus. Participate in code reviews and ensure adherence to coding standards and best practices. Understanding of data governance, compliance, and security best practices. Strong problem-solving and troubleshooting skills. Excellent communication and collaboration skills – with an understanding of stakeholder mapping. Mandatory Skills - AWS or Azure Cloud, Python programming, SQL, Spark SQL, Hive, Spark optimization techniques and PySpark. Share resume at sonali.mangore@impetus.com with details (CTC, Expected CTC, Notice Period)
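The job/pipeline orchestration mentioned in this listing boils down to running jobs in dependency order. Services like Step Functions or MWAA (Airflow) do this at scale, but the core idea can be sketched with Python's standard library; the job names below are invented:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Invented job names; each job maps to the set of jobs it depends on.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_joined": {"extract_orders", "extract_customers"},
    "load_redshift": {"transform_joined"},
}

# static_order() yields jobs such that every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)  # both extracts first, then transform_joined, then load_redshift
```

A real orchestrator adds retries, scheduling, and backfills on top of exactly this dependency-ordering idea.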

Posted 6 days ago


5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are hiring for one of the IT product-based companies Job Title: Senior Data Engineer Exp: 5+ years Location: Gurgaon/Pune Work Mode: Hybrid Skills: Azure and Databricks Programming Languages: Python, PowerShell; .Net/Java are a plus What you will do Participate in the design and development of highly performant and scalable large-scale Data and Analytics products Participate in requirements grooming, analysis and design discussions with fellow developers, architects and product analysts Participate in product planning by providing estimates on user stories Participate in daily standup meetings and proactively provide status on tasks Develop high-quality code according to business and technical requirements as defined in user stories Write unit tests that will improve the quality of your code Review code for defects and validate implementation details against user stories Work with quality assurance analysts who build test cases that validate your work Demo your solutions to product owners and other stakeholders Work with other Data and Analytics development teams to maintain consistency across the products by following standards and best software development practices Provide third-tier support for our product suite What you will bring 3+ years of Data Engineering and Analytics experience 2+ years of Azure and Databricks (or Apache Spark, Hadoop and Hive) working experience Knowledge and application of the following technical skills: T-SQL/PL-SQL, PySpark, Azure Data Factory, Databricks (or Apache Spark, Hadoop and Hive), and Power BI or equivalent Business Intelligence tools Understanding of dimensional modeling and Data Warehouse concepts Programming skills such as Python, PowerShell, .Net/Java are a plus Git repository experience and a thorough understanding of branching and merging strategies.
2 years' experience developing in the Agile Software Development Life Cycle and Scrum methodology Strong planning and time management skills Advanced problem-solving skills; data-driven Excellent written and oral communication skills Team player who fosters an environment of shared success, is passionate about always learning and improving, self-motivated, open-minded, and creative What we would like to see Bachelor's degree in computer science or a related field Healthcare knowledge is a plus

Posted 6 days ago


5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position: MicroStrategy Engineer Experience: 5+ Years Location: Pune, India Time type: Contract Payroll: airisDATA Job Description: We are looking for a seasoned MicroStrategy Engineer · 5+ years' experience in data and reporting technologies · Knowledge and hands-on experience in platforms and tools like MicroStrategy and other reporting tools · MicroStrategy skills: development of reports, cubes, performance improvements · Role includes requirement gathering and completing end-to-end projects · Leveraging cloud-native technologies to build and deploy data pipelines · Implementation of MicroStrategy on cloud/Azure · Working knowledge of Linux, Oracle, Hive, Impala required · Postgres is a plus, but not mandatory · Banking domain expertise is a plus as well, but not mandatory · Administration skills are a plus

Posted 6 days ago


0.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Designation: Senior Analyst
Level: L2
Experience: 4 to 7 years
Location: Chennai

Job Description: We are seeking a highly skilled, motivated, and results-driven Senior Analyst with 4+ years of experience to join a fast-paced, collaborative team at LatentView Analytics working in the financial services domain.

Responsibilities:
Drive measurement strategy and lead the end-to-end A/B testing process for areas of web optimization such as landing pages, user funnels, navigation, checkout, product lineup, pricing, search, and monetization opportunities.
Analyze web user behavior at both the visitor and session level using clickstream data, anchoring to key web metrics and identifying user behavior through engagement and pathing analysis.
Leverage AI/GenAI tools for automating tasks and building custom implementations.
Use data, strategic thinking, and advanced scientific methods, including predictive modeling, to enable data-backed decision making for Intuit at scale.
Measure the performance and impact of product releases.
Apply strategic thinking and systems thinking to solve business problems and influence strategic decisions through data storytelling.
Partner with GTM, Product, Engineering, and Design teams to drive analytics projects end to end.
Build models to identify patterns in traffic and user behavior that inform acquisition strategies and optimize business outcomes.

Skills:
5+ years of experience in web, product, marketing, or other related analytics fields solving marketing/product business problems
4+ years of experience designing and executing experiments (A/B and multivariate), with a deep understanding of the statistics behind hypothesis testing
Proficiency in alternative A/B testing methods such as difference-in-differences (DiD), synthetic control, and other causal inference techniques
5+ years of technical proficiency in SQL, Python or R, and data visualization tools like Tableau
5+ years of experience manipulating and analyzing large, complex datasets (e.g., clickstream data), constructing ETL data pipelines, and working with big data technologies (e.g., Redshift, Spark, Hive, BigQuery) and solutions from cloud platforms
3+ years of experience in web analytics, analyzing website traffic patterns and conversion funnels
5+ years of experience building ML models (e.g., regression, clustering, trees) for personalization applications
Demonstrated ability to drive strategy, execution, and insights for AI-native experiences across the development lifecycle (ideation, discovery, experimentation, scaling)
Outstanding communication skills with both technical and non-technical audiences
Ability to tell stories with data, influence business decisions at the leadership level, and provide solutions to business problems
Ability to manage multiple projects simultaneously to meet objectives and key deadlines

Job Snapshot
Updated Date: 28-07-2025
Job ID: J_3917
Location: Chennai, Tamil Nadu, India
Experience: 4 - 7 Years
Employee Type: Permanent
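The posting asks for a deep understanding of the statistics behind A/B hypothesis testing. As a quick illustration (not part of the listing itself), a pooled two-proportion z-test comparing conversion rates between a control and a variant can be sketched in a few lines of Python; the conversion counts below are made-up numbers:

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Pooled two-proportion z-test for an A/B conversion experiment.

    Compare |z| against 1.96 for a two-sided test at the 5% level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: control converts 1000/20000 (5.0%),
# variant converts 1100/20000 (5.5%).
z = two_proportion_z(1000, 20000, 1100, 20000)
print(f"z = {z:.2f}, significant at 5%: {abs(z) > 1.96}")  # z ≈ 2.24
```

In practice a library routine such as statsmodels' `proportions_ztest` would be used; the function above just makes the pooled-variance arithmetic explicit.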

Posted 6 days ago

Apply

0.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location: Bengaluru, Karnataka, India
Job ID: R-232528
Date posted: 28/07/2025
Job Title: Analyst – Data Engineer

Introduction to role:
Are you ready to make a difference in the world of data science and advanced analytics? As a Data Engineer within the Commercial Strategic Data Management team, you'll play a pivotal role in transforming data science solutions for the Rare Disease Unit. Your mission will be to craft, develop, and deploy data science solutions that have a real impact on patients' lives. By leveraging cutting-edge tools and technology, you'll enhance delivery performance and data engineering capabilities, creating a seamless platform for the Data Science team and driving business growth. Collaborate closely with the Data Science and Advanced Analytics team, US Commercial leadership, Sales Field Team, and Field Operations to build data science capabilities that meet commercial needs. Are you ready to take on this exciting challenge?

Accountabilities:
Collaborate with the Commercial multi-functional team to find opportunities for using internal and external data to enhance business solutions.
Work closely with business and advanced data science teams on cross-functional projects, delivering complex data science solutions that contribute to the Commercial Organization.
Manage platforms and processes for complex projects using a wide range of data engineering techniques in advanced analytics.
Prioritize business and information needs with management; translate business logic into technical requirements such as queries, stored procedures, and scripts.
Interpret and process data, analyze results, present findings, and provide ongoing reports.
Develop and implement databases, data collection systems, data analytics, and strategies that optimize data efficiency and quality.
Acquire data from primary or secondary sources and maintain databases/data systems.
Identify and define new process improvement opportunities.
Manage and support data solutions in BAU scenarios, including data profiling, designing data flows, creating business alerts for fields, and query optimization for ML models.

Essential Skills/Experience:
BS/MS in a quantitative field (Computer Science, Data Science, Engineering, Information Systems, Economics)
5+ years of work experience with data and database skills: Python, SQL, Snowflake, Amazon Redshift, MongoDB, Apache Spark, Apache Airflow, AWS cloud and Amazon S3, Oracle, Teradata
Good experience in Apache Spark, Talend Administration Center, AWS Lambda, MongoDB, Informatica, or SQL Server Integration Services
Experience building ETL pipelines and data integration
Efficient data management: extract, consolidate, and store large datasets with improved data quality and consistency
Streamlined data transformation: convert raw data into usable formats at scale, automate tasks, and apply business rules
Good written and verbal skills to communicate complex methods and results to diverse audiences; willingness to work in a cross-cultural environment
Analytical mind with a problem-solving inclination; proficiency in data manipulation, cleansing, and interpretation
Experience in support and maintenance projects, including ticket handling and process improvement
Workflow orchestration: schedule and manage data pipelines for smooth flow and automation
Scalability and performance: handle large data volumes with optimized processing capabilities
Experience with Git

Desirable Skills/Experience:
Knowledge of distributed computing and big data technologies like Hive, Spark, Scala, HDFS, used alongside statistical tools like Python/R
Experience working with HTTP requests/responses and REST APIs
Familiarity with data visualization tools like Tableau, Qlik, Power BI, Excel charts/reports
Working knowledge of Salesforce/Veeva CRM, data governance, and data mining algorithms
Hands-on experience with EHR, administrative claims, and laboratory data (e.g., Prognos, IQVIA, Komodo, Symphony claims data)
Good experience in consulting, healthcare, or biopharmaceuticals

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca's Alexion division, you'll find an environment where your work truly matters. Embrace the opportunity to grow and innovate within a rapidly expanding portfolio. Experience the entrepreneurial spirit of a leading biotech combined with the resources of a global pharma. You'll be part of an energizing culture where connections are built to explore new ideas. As a member of our commercial team, you'll meet the needs of under-served patients worldwide. With tailored development programs designed for skill enhancement and fostering empathy for patients' journeys, you'll align your growth with our mission. Supported by exceptional leaders and peers across marketing and compliance, you'll drive change with integrity in a culture celebrating diversity and innovation. Ready to make an impact? Apply now to join our team!

Date Posted: 29-Jul-2025
Closing Date: 04-Aug-2025

Alexion is proud to be an Equal Employment Opportunity and Affirmative Action employer. We are committed to fostering a culture of belonging where every single person can belong because of their uniqueness.
The Company will not make decisions about employment, training, compensation, promotion, and other terms and conditions of employment based on race, color, religion, creed or lack thereof, sex, sexual orientation, age, ancestry, national origin, ethnicity, citizenship status, marital status, pregnancy, (including childbirth, breastfeeding, or related medical conditions), parental status (including adoption or surrogacy), military status, protected veteran status, disability, medical condition, gender identity or expression, genetic information, mental illness or other characteristics protected by law. Alexion provides reasonable accommodations to meet the needs of candidates and employees. To begin an interactive dialogue with Alexion regarding an accommodation, please contact accommodations@Alexion.com. Alexion participates in E-Verify.
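The ETL responsibilities described above (extracting raw records, applying business rules, deduplicating, loading clean rows) can be sketched in a minimal, dependency-free way; the in-memory CSV source and the `transform` rules below are hypothetical stand-ins, not this team's actual pipeline:

```python
import csv
import io

def transform(rows):
    """Apply simple business rules: drop rows missing an id,
    normalize the amount field, and deduplicate on id."""
    seen, clean = set(), []
    for row in rows:
        rid = row.get("id", "").strip()
        if not rid or rid in seen:  # missing id or duplicate -> drop
            continue
        seen.add(rid)
        row["amount"] = round(float(row.get("amount") or 0), 2)
        clean.append(row)
    return clean

# Extract: an in-memory CSV stands in for a real source (S3, Redshift, ...).
raw = "id,amount\nA1,10.0\nA1,10.0\n,3.0\nB2,7.5\n"
clean = transform(csv.DictReader(io.StringIO(raw)))
print([r["id"] for r in clean])  # the duplicate and the id-less row are dropped
```

In a production pipeline the same extract/transform/load steps would be orchestrated by a scheduler such as Apache Airflow, with each stage running as its own task.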

Posted 6 days ago

Apply

2.0 - 31.0 years

3 - 4 Lacs

Work From Home

Remote

Roles and responsibilities for a Senior Flutter Developer who can take your idea and turn it into a full-fledged app ready for Play Store and App Store deployment. This role assumes full-stack mobile app ownership and expertise across architecture, development, deployment, and maintenance.

🚀 Senior Flutter Developer - Roles & Responsibilities

✅ Core Responsibilities
End-to-End App Development: Translate product ideas, mockups, or wireframes into functional, high-performance Flutter apps. Build apps for both Android and iOS from a single codebase.
Architecture & Project Planning: Design the app architecture: state management (e.g., Riverpod, Bloc), clean code structure, scalable folder structures. Set up modular codebases for maintainability and team collaboration. Break down high-level product requirements into technical tasks.
API Integration & Backend Communication: Consume RESTful APIs, GraphQL, or Firebase services. Implement error handling, loading states, and offline-first strategies where needed.
Authentication & Security: Implement login/signup with phone/email/Google/Apple. Set up Firebase Auth, AWS Cognito, or custom auth flows. Secure data storage (e.g., SharedPreferences, Hive, EncryptedStorage).
Database & Storage: Local databases: Hive, Drift, SQLite. Cloud databases: Firebase Firestore, Realtime DB, Supabase, or a custom backend.
State Management: Use and recommend best practices for scalable state management (Riverpod preferred for future-proofing).
Third-Party SDKs & Integrations: Integrate SDKs for payments (e.g., Razorpay, Stripe), push notifications (Firebase Messaging), analytics, deep linking, maps, etc.
UI/UX Implementation: Create pixel-perfect, responsive UIs from Figma or other design tools. Add animations using Rive, Lottie, or Flutter's native animation tools.
Testing & Quality Assurance: Write unit, widget, and integration tests. Ensure app stability using CI/CD and crash reporting tools (e.g., Sentry, Firebase Crashlytics).
Deployment: Prepare apps for the Google Play Store and Apple App Store with all necessary compliance. Handle signing, provisioning profiles, and release builds. Set up CI/CD for automated builds (e.g., Codemagic, Bitrise, GitHub Actions).

👨‍💼 Team & Communication
Collaborate with the product manager, UI/UX designer, and backend developers.
Participate in agile sprints, daily standups, and sprint planning.
Convert product vision into tech specs, timelines, and deliverables.

🧠 Required Expertise
3–5+ years of Flutter experience (Dart, Widgets, CustomPainter, Platform Channels).
Prior experience launching apps on both the Play Store and App Store.
Deep understanding of mobile architecture (clean architecture, MVVM, hexagonal, etc.).
Familiarity with performance optimization, lazy loading, and memory management.
Proficiency in Git and GitHub/GitLab/Bitbucket workflows.
Experience with cloud (Firebase, AWS Amplify) is a plus.

💡 Bonus Skills
Native Android (Kotlin) / iOS (Swift) experience for bridging platform features.
DevOps experience (CI/CD pipelines, release automation).
Experience working in a startup or building MVPs from scratch.
Familiarity with tools like Figma, Notion, Jira, Postman.

📦 Expected Deliverables
Production-ready Flutter app (Android + iOS).
Complete source code with documentation.
Deployed app live on both stores.
Basic analytics, crash reporting, and CI/CD setup.
Post-release support for bug fixing and iteration.

Posted 6 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV.

Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team
Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10+ million queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate, and access the data in the lake, for both streaming and batch data. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in open source, and we plan to increase our engagement over time.

About the Role
Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost, and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert with Big Data technologies? Have you looked under the hood of these systems? Are you interested in open source? If you answered "Yes" to these questions, this role is for you!
What you will be doing
You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones; making sure the systems run efficiently and at minimal cost is a top priority.
You will make changes to the underlying systems and, where the opportunity arises, contribute your work back to open source.
You will also support internal customers and provide on-call coverage for the systems we host; providing a stable environment and a great user experience is another top priority for the team.

We are excited if you have
7+ years of production experience building big data platforms based on Spark, Trino, or equivalent
Strong programming expertise in Java, Scala, Kotlin, or another JVM language
A robust grasp of distributed systems concepts, algorithms, and data structures
Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
Experience working with at least 3 of: Big Data/Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
Extensive hands-on experience with a public cloud (AWS or GCP)
BS/MS degree in CS or equivalent
AI literacy / an AI growth mindset

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role.
For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a small number of very talented people can do more, at lower cost, than larger teams of less talented ones. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002.

To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
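The streaming systems this role supports (Kafka, Flink) fundamentally compute aggregates over time windows of events. Purely as a toy illustration of that idea, and not Roku code, here is a tumbling-window count in plain Python with hypothetical playback events:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Bucket (timestamp, key) events into fixed-size tumbling windows
    and count occurrences per key -- the basic aggregation a streaming
    engine performs continuously and at far larger scale."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_s) * window_s
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical playback events: (seconds since epoch, event type)
events = [(5, "play"), (42, "pause"), (61, "play"), (65, "play")]
counts = tumbling_window_counts(events)
print(counts)  # {0: {'play': 1, 'pause': 1}, 60: {'play': 2}}
```

A real engine additionally handles out-of-order events, watermarks, and state checkpointing, which is where most of the operational complexity lies.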

Posted 1 week ago

Apply