2.0 - 4.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Fusion Plus Solutions Inc is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Responsibilities include liaising with coworkers and clients to clarify the requirements for each task; designing and building infrastructure that allows big data to be accessed and analyzed; reworking existing frameworks to optimize their functioning; testing such structures to ensure that they are fit for use; preparing raw data for manipulation by data scientists; detecting and correcting errors in your work; ensuring that your work remains backed up and readily accessible to relevant coworkers; and remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Tech Stalwart Solution Private Limited is looking for a Sr. Data Engineer to join our dynamic team and embark on a rewarding career journey. Responsibilities include liaising with coworkers and clients to clarify the requirements for each task; designing and building infrastructure that allows big data to be accessed and analyzed; reworking existing frameworks to optimize their functioning; testing such structures to ensure that they are fit for use; preparing raw data for manipulation by data scientists; detecting and correcting errors in your work; ensuring that your work remains backed up and readily accessible to relevant coworkers; and remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Locations: Bengaluru | Gurgaon

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
As part of BCG's X team, you will work closely with consulting teams on a diverse range of advanced analytics and engineering topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domains) by providing analytical and engineering subject matter expertise. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining data pipelines, systems, and solutions that empower our clients to make informed business decisions. You will collaborate closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver high-quality data solutions that meet our clients' needs.

You're Good At
Delivering original analysis and insights to case teams, typically owning all or part of an analytics module while integrating with a case team. Designing, developing, and maintaining efficient and robust data pipelines for extracting, transforming, and loading data from various sources into data warehouses, data lakes, and other storage solutions. Building data-intensive solutions that are highly available, scalable, reliable, secure, and cost-effective using programming languages like Python and PySpark. Deep knowledge of big data querying and analysis tools such as PySpark, Hive, Snowflake, and Databricks. Broad expertise in at least one cloud platform such as AWS, GCP, or Azure. Working knowledge of automation and deployment tools such as Airflow, Jenkins, and GitHub Actions, as well as infrastructure-as-code technologies like Terraform and CloudFormation. Good understanding of DevOps, CI/CD pipelines, orchestration, and containerization tools like Docker and Kubernetes. Basic understanding of machine learning methodologies and pipelines. Communicating analytical insights through sophisticated synthesis and packaging of results (including PPT slides and charts) with consultants; collecting, synthesizing, and analyzing case team learnings and inputs into new best practices and methodologies.

Communication Skills
Strong communication skills, enabling effective collaboration with both technical and non-technical team members.

Thinking Analytically
You should be strong in analytical solutioning, with hands-on experience in advanced analytics delivery through the entire analytics life cycle. Strong analytics skills with the ability to develop and codify knowledge and provide analytical advice where required.
What You'll Bring
Bachelor's or Master's degree in computer science engineering/technology. At least 2-4 years within the relevant domain of data engineering across industries, with work experience providing analytics solutions in a commercial setting. Consulting experience will be considered a plus. Proficient understanding of distributed computing principles, including management of Spark clusters with all included services; experience with various implementations of Spark preferred. Basic hands-on experience with data engineering tasks like productizing data pipelines, building CI/CD pipelines, and code orchestration using tools like Airflow and DevOps practices.
Good to have: software engineering concepts and best practices, like API design and development, testing frameworks, and packaging; experience with NoSQL databases such as HBase, Cassandra, and MongoDB; knowledge of web development technologies; understanding of the different stages of machine learning system design and development. #BCGXjob

Who You'll Work With
You will work with the case team and/or client technical POCs and the broader X team.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
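As an illustration of the pipeline-orchestration work described above (Python, Airflow, CI/CD), here is a minimal, hedged sketch of a three-step extract-transform-load DAG in Apache Airflow 2.x. It is not BCG's actual codebase; the DAG id, task names, and data values are hypothetical placeholders.

```python
# Minimal Airflow 2.x ETL sketch; task names and data are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (placeholder data).
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]


def transform(**context):
    # Read the upstream task's output via XCom and apply a simple rule.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] > 0]


def load(**context):
    # Write the cleaned rows to the target store (placeholder action).
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Dependencies define the ETL ordering.
    extract_task >> transform_task >> load_task
```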
Posted 2 weeks ago
5.0 - 10.0 years
13 - 22 Lacs
Bengaluru
Work from Office
Job Opportunity: Senior Data Analyst - Bangalore. Location: Bangalore, India. Company: GSPANN Technologies. Apply: Send your resume to heena.ruchwani@gspann.com. GSPANN is hiring a Senior Data Analyst with 5-7 years of experience to join our dynamic team in Bangalore! What We're Looking For: Education: Bachelor's degree in Computer Science, MIS, or a related field. Experience: 5-7 years in data analysis, with a strong ability to translate business strategy into actionable insights. Advanced SQL expertise. Proficiency in Tableau, Power BI, or Domo. Experience with AWS, Hive, Snowflake, Presto. Ability to define and track KPIs across domains like Sales, Consumer Behavior, and Supply Chain. Strong problem-solving skills and attention to detail. Excellent communication and collaboration abilities. Experience working in Agile environments. Retail or eCommerce domain experience is a plus. If this sounds like the right fit for you, don't wait - send your updated resume to heena.ruchwani@gspann.com today!
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code to meet specific business needs or user areas. Monitor and control all phases of the development process (analysis, design, construction, testing, and implementation) as well as provide user and operational support on applications to business users. Work with the business teams to gather user requirements. Design and develop Talend ETL jobs and SQL queries based on signed-off requirements. Work on production support issues in coordination with the production support team. Perform periodic issue analysis to identify issues and find root causes. Coordinate with business users to support ad-hoc report development/change requests. Experience primarily with Talend ETL development and version upgrade, migration, and tool replacement projects. Good Unix shell scripting skills. Experience in RDBMS, preferably Oracle, with SQL query writing skills. Good understanding of data-warehousing concepts like schemas and facts/dimensions. Familiarity with identification and resolution of data quality issues. Has the ability to operate with a limited level of direct supervision. Acts as SME to senior stakeholders and/or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Strong and effective interpersonal and communication skills and the ability to interact professionally with business users. Great team player with a passion to collaborate with colleagues.

Qualifications: Bachelor's degree/University degree or equivalent experience. 5-8 years of relevant experience in the financial services industry. Intermediate-level experience in an Applications Development role. Consistently demonstrates clear and concise written and verbal communication. Demonstrates problem-solving and decision-making skills. Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Education: Bachelor's degree/University degree or equivalent experience.

Must Have Skills: Experience primarily with Talend ETL development and migration projects. Experience with any one other ETL tool - IBM DataStage/Ab Initio/Spark. Experience with SAP BusinessObjects or any other Business Intelligence tool like Tableau/Cognos (secondary). Good Unix shell scripting skills. Experience in RDBMS, preferably Oracle, with SQL query writing skills. Good understanding of data-warehousing concepts like schemas and facts/dimensions. Familiarity with identification and resolution of data quality issues. Strong and effective interpersonal and communication skills and the ability to interact professionally with business users.
Great team player with a passion to collaborate with colleagues. Knowledge of any application server (WebLogic, WAS, Tomcat, etc.). This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Good to have Skills: Good understanding of data migration, version upgrade, and tool replacement projects. Good understanding of Big Data and the Hadoop ecosystem. Apache Spark with Java. Good understanding of Hive and Impala. Testing frameworks (test-driven development). Good communication skills. Knowledge of Maven and Python scripting skills. Good problem-solving skills. Beneficial: EMS, Kafka, domain knowledge.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Streaming data technical skills requirements. Experience: 5+ years.
- Solid hands-on and solution architecting experience in big data technologies (AWS preferred)
- Hands-on experience with AWS DynamoDB, EKS, Kafka, Kinesis, Glue, and EMR
- Hands-on experience with a programming language like Scala with Spark
- Good command and working experience of Hadoop MapReduce, HDFS, Hive, HBase, and/or NoSQL databases
- Hands-on working experience with any of the data engineering/analytics platforms (Hortonworks / Cloudera / MapR / AWS), AWS preferred
- Hands-on experience with data ingestion tools: Apache NiFi, Apache Airflow, Sqoop, and Oozie
- Hands-on working experience of data processing at scale with event-driven systems and message queues (Kafka / Flink / Spark Streaming), with hands-on development experience in the above technologies
- Data warehouse exposure to Apache NiFi, Apache Airflow, and Kylo
- Operationalization of ML models on AWS (e.g. deployment, scheduling, model monitoring, etc.)
- Feature engineering and data processing to be used for model development
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
- Experience building data pipelines for structured/unstructured, real-time/batch, and synchronous/asynchronous events using MQ, Kafka, and stream processing
- Hands-on working experience in analyzing source system data and data flows, working with structured and unstructured data
- Must be very strong in writing SQL queries.
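To make the event-driven requirements above concrete, here is a minimal, illustrative sketch (not the employer's actual code) of a PySpark Structured Streaming job that reads a Kafka topic and lands parsed events as Parquet. The broker address, topic, schema, and storage paths are hypothetical, and running it requires the spark-sql-kafka connector package on the cluster.

```python
# Minimal Kafka-to-Parquet streaming sketch; all names and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("kafka-events-sketch").getOrCreate()

# Raw Kafka records arrive as binary key/value columns.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
)

# Assumed JSON payload shape: {"user_id": "...", "amount": 12.5}
schema = StructType().add("user_id", StringType()).add("amount", DoubleType())

events = raw.select(
    from_json(col("value").cast("string"), schema).alias("event")
).select("event.*")

# Append each micro-batch to object storage; the checkpoint provides fault-tolerant recovery.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```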
Posted 2 weeks ago
4.0 - 9.0 years
3 - 8 Lacs
Pune
Work from Office
Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake. Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance. Technologies: SQL, Informatica PowerCenter, Talend, Big Data, Hive
Posted 2 weeks ago
4.0 - 9.0 years
7 - 17 Lacs
Hyderabad
Hybrid
Mega Walk-in Drive for Senior Software Engineer - Informatica, Teradata, SQL

Your future duties and responsibilities:
Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems.

Key Responsibilities: Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud. Work with business analysts and data architects to understand data requirements and translate them into technical solutions. Integrate data from various sources including relational databases, flat files, APIs, and cloud-based platforms. Create and maintain technical documentation for ETL processes and data flows. Optimize existing ETL workflows for performance and scalability. Troubleshoot and resolve ETL and data-related issues in a timely manner. Implement data validation, transformation, and cleansing techniques. Collaborate with QA teams to support data testing and verification. Ensure compliance with data governance and security policies.

Required qualifications to be successful in this role: Minimum 4 years of experience with Informatica PowerCenter or Informatica Cloud. Proficiency in SQL and experience with databases like Oracle, SQL Server, Snowflake, or Teradata. Strong understanding of ETL best practices and data integration concepts. Experience with job scheduling tools like Autosys, Control-M, or equivalent. Knowledge of data warehousing concepts and dimensional modeling. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Good to have: Python or any programming knowledge. Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications: Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures. Informatica certification is a plus. Experience with Agile methodologies and DevOps practices.

Shift Timings: General Shift (5 days work from office for the initial 8 weeks). Skills: Data Engineering, Hadoop, Hive, Python, SQL, Teradata. Notice Period: 0-45 days. Prerequisites: a copy of your Aadhaar card, a copy of your PAN card, and UAN. Disclaimer: The selected candidates will initially be required to work from the office for 8 weeks before transitioning to a hybrid model with 2 days of work from the office each week.
Posted 2 weeks ago
8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Senior Software Engineer. Location: Bengaluru. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include: Building, refining, tuning, and maintaining our real-time and batch data infrastructure. Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc. Maintaining data quality and accuracy across production data systems. Working with Data Analysts to develop ETL processes for analysis and reporting. Working with Product Managers to design and build data products. Working with our DevOps team to scale and optimize our data infrastructure. Participating in architecture discussions, influencing the roadmap, and taking ownership and responsibility over new projects. Participating in the on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 8 years of software engineering experience. An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired. 2+ years of experience/fluency in Python. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience working with container-based solutions is a plus. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Experience in the advertising attribution domain is a plus. Experience in agile software development processes. Excellent interpersonal and communication skills.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver (Performance Parameter - Measure):
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability Development - Triages completed, Technical Test performance

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.
Posted 2 weeks ago
8+ years
0 Lacs
Gurugram, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role? We are looking for a Manager, ICS Complaint Program Reporting and Insights, with a specific focus on establishing the Reporting and Business Insights workstream for the Complaints Program in line with the requirements of AEMP71. It will involve extensive collaboration with multiple partners across the Global Servicing Group, international markets and legal entities, and ICS Control Management.

The Manager, ICS Complaint Program Reporting and Insights will: Lead and develop the ICS Complaints Reporting and Insights program. Establish the analytics, insights, and regulatory reporting for the ICS Complaints Program. Collaborate directly with senior leaders to help them understand complaint trends and how they can respond to them. Identify complaint themes by leveraging data insights and referring them to ICS and LE leadership as appropriate. Proactively analyze risk trends, undertake root-cause analysis, and provide consultative support to business and stakeholders. Ensure all regulatory requests are managed with 100% accuracy and timeliness.

The Manager, Complaints Reporting and Insights will: Design, build, and maintain dashboards and automated reports leveraging ICS Complaints data. Analyze complaint data to help identify root causes, areas of concern, and potential issues. Compile thematic risk reporting (levels, trends, causes) to provide actionable and meaningful insights to the business on current risk levels, emerging trends, and root causes. Translate complex data into concise, impactful visualizations and presentations for decision making. Proactively identify opportunities to improve data quality, reporting processes, and analytical capabilities. Utilize Natural Language Processing and generative AI tools to automate report generation, summarize data insights, and improve data storytelling. Collaborate with stakeholders to define KPIs, reporting needs, and performance metrics. Research and implement AI-driven BI innovations to continuously enhance business insights and reporting best practices.

Required Qualifications: 8+ years of experience in data analytics, generating business insights, or a similar role. Proficient analytical and problem-solving skills, with an ability to analyze data, identify trends, and evaluate risk scenarios effectively. Hands-on experience with Python, R, Tableau (Tableau Developer or Tableau Desktop Certified Professional), Power BI, Cornerstone, SQL, Hive, and advanced MS Excel (macros, pivots). Hands-on experience with AI/ML frameworks, NLP, sentiment analysis, text summarization, etc.
Strong analytical, critical thinking, and problem-solving skills. Ability to communicate complex findings clearly to both technical and non-technical audiences.

Preferred Qualifications: Bachelor's degree in Business, Risk Management, Statistics, Computer Science, or a related field; advanced degrees (e.g., MBA, MSc) or certifications are advantageous. Experience in at least one of the following: providing identification of operational risks throughout business processes and systems; enhancing risk assessments and associated methodologies; reviewing and creating thematic risk reporting to provide actionable insights into risk levels, emerging trends, and root causes. Experience in the financial services industry. Experience in Big Data or Data Science will be a definite advantage. Familiarity with ERP systems or business process tools. Knowledge of predictive analytics.
Posted 2 weeks ago
3.0 - 6.0 years
13 - 18 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's Platform Development team designs, implements, tests and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact and contribute to bringing better health outcomes.

What you'll do: Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized. As part of our full-stack product engineering team, you will build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies based on the Amazon AWS cloud platform. Work with junior developers to implement large features that are on the cutting edge of Big Data. Be a technical leader to your team, and help them improve their technical skills. Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design. Work with product managers and architects to design product architecture and to work on POCs. Take immediate responsibility for project deliverables. Understand client business issues and design features that meet client needs. Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills.

What you'll bring: Bachelor's Degree in CS, IT, or a related discipline. Strong analytic, problem solving, and programming ability. Experience in coding in an object-oriented language such as Python, Java, C#, etc.
Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies. Experience with development on the AWS (Amazon Web Services) platform is preferable. Experience in Linux shell or PowerShell scripting is preferable. Experience in HTML5, JavaScript, and JavaScript libraries is preferable. Understanding of data science algorithms. Good to have: Pharma domain understanding. Initiative and drive to contribute. Excellent organizational and task management skills. Strong communication skills. Ability to work in global cross-office teams. ZS is a global firm; fluency in English is required.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com
Posted 2 weeks ago
4.0 - 9.0 years
13 - 18 Lacs
Bengaluru
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's India Capability & Expertise Center (CEC) houses more than 60% of ZS people across three offices in New Delhi, Pune and Bengaluru. Our teams work with colleagues around the world to deliver real-world solutions to the clients who drive our business. The CEC maintains standards of analytical, operational and technological excellence to deliver superior results to our clients. ZS's Beyond Healthcare Analytics (BHCA) Team is shaping one of the key growth vectors for ZS. Beyond Healthcare engagements comprise clients from industries like Quick Service Restaurants, Technology, Food & Beverage, Hospitality, Travel, Insurance, Consumer Products Goods and other such industries across the North America, Europe and South East Asia regions. The BHCA India team currently has a presence across the New Delhi, Pune and Bengaluru offices and is continuously expanding at a great pace. The BHCA India team works with colleagues across clients and geographies to create and deliver real-world, pragmatic solutions leveraging AI SaaS products & platforms, Generative AI applications, and other advanced analytics solutions at scale.

What You'll Do: Design and implement highly available data pipelines using Spark and other big data technologies. Work with the data science team to develop new features to increase model accuracy and performance. Create standardized data models to increase standardization across client deployments. Troubleshoot and resolve issues in existing ETL pipelines. Complete proofs of concept to demonstrate capabilities and connect to new data sources. Instill best practices for software development, ensure designs meet requirements, and deliver high-quality work on schedule. Document application changes and development updates.

What You'll Bring: A master's or bachelor's degree in computer science or a related field from a top university. 4+ years' overall experience; 2+ years' experience in data engineering using Apache Spark and SQL. 2+ years of experience in building and leading a strong data engineering team. Experience with full software lifecycle methodology, including coding standards, code reviews, source control management, build processes, testing, and operations.
In-depth knowledge of Python, SQL, PySpark, distributed computing, analytical databases and other big data technologies. Strong knowledge of one or more cloud environments such as AWS, GCP, and Azure. Familiarity with the data science and machine learning development process. Familiarity with orchestration tools such as Apache Airflow. Strong analytical skills and the ability to develop processes and methodologies. Experience working with cross-functional teams, including UX, business (e.g. Marketing, Sales), product management and/or technology/IT/engineering, is a plus. Characteristics of a forward thinker and self-starter who thrives on new challenges and adapts quickly to learning new knowledge.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Mumbai
Work from Office
Location: Mumbai. Posted 30+ days ago. Job requisition ID: R-044656.

About the Job: The Red Hat Sales team is looking for an experienced Account Solutions Architect to join us in Mumbai, India. In this role, you will provide the first major experience our customers have with Red Hat while creating possibilities, solving problems, and establishing working relationships. You'll discover and analyze the business and technical needs of our customers, while collaborating with the Sales and Technical Delivery teams to help them invest wisely in the best solutions that will give their systems maximum flexibility, allowing them to run faster and more efficiently. You'll need to have extensive technical expertise, passion for open source, a thorough understanding of business processes, and the ability to identify and solve issues at the enterprise level. As an Account Solutions Architect, you will also need to have great communication and people skills.

What will you do: Develop strategic relationships with our customers to become a trusted adviser for Red Hat's offerings and solutions. Demonstrate ownership of the technical relationships and technical sales cycle within a set of named accounts in your territory. Ensure revenue and new business quotas/targets and service objectives are met while maintaining a high level of satisfaction among prospective and existing customers. Provide presales technical support to our Enterprise Sales team. Support evaluations of our offerings and technical proofs of concept. Respond to customer and partner inquiries, including requests for proposal (RFPs) and requests for information (RFIs). Provide pre-sales technical support for the development and implementation of complex solutions. Use in-depth domain and product knowledge to provide technical expertise to customers or partners through sales presentations, product demonstrations, workshops, evaluations and proofs of concept/technology (POCs/POTs). Assess potential application of company products to meet customer needs and prepare detailed product specifications for the development and implementation of customer solutions. Create detailed design and implementation specifications for complex products/applications/solutions.
Provide consultation to prospective users/customers/partners on product capability assessment and validation.

What will you bring: 10+ years of experience in the IT industry. 5+ years of experience working as a presales engineer, consultant, IT architect, or equivalent supporting partners and enterprises. 5+ years of experience working in the BFSI industry. 5+ years of experience with solutions design or implementation of complex application systems, cloud, multi-datacenter, and modernizing application environments, as well as multi-product integration. Experience in application modernization and digital transformation, and understanding of modern methodologies like Kubernetes, containers and microservices architecture, agile development, and DevSecOps, along with associated capabilities like automation, orchestration, and configuration management. Ability to explain technical concepts to non-technical audiences. Familiarity with enterprise solutions and architectures, including cloud, big data, virtualization, storage, middleware, clustering, and high availability. Excellent presentation skills; ability to present to small and large groups of mixed business, technical, management, and leadership audiences. Record of developing relationships at engineering, commercial, and executive levels throughout large enterprise IT organizations. Understanding of complex enterprise solutions and architectures. Ability to work well in a team environment and collaborate with others to provide the best solutions. Knowledge of sophisticated sales motions. Willingness to travel up to 50% of the time. Record of working with partners, distributors, consultants, and service partners to create solution propositions around Red Hat's solutions. Expertise in one or more offerings from the Red Hat portfolio like OpenShift, Ansible, RHEL, JBoss, and Application Services/Middleware.

The following are considered a plus: Ability to handle multiple priorities and manage multiple large transactions between multiple organizations. Experience working as an enterprise architect and strategizing with C-level users regarding technologies and roadmaps. Red Hat Certified Architect (RHCA), Red Hat Certified Engineer (RHCE), VMware Certified Professional (VCP), or Information Technology Infrastructure Library (ITIL) certifications.

About Red Hat: Red Hat is the world's leading provider of enterprise software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
Posted 2 weeks ago
5.0 - 10.0 years
18 - 22 Lacs
Bengaluru
Work from Office
Location: Bangalore - Carina. Posted 30+ days ago. Job requisition ID: R-046193.

About the Job: The Red Hat engineering team is looking for a software engineer to work on the world's leading enterprise Linux platform, Red Hat Enterprise Linux (RHEL), on hybrid cloud platforms. In this role, you will help develop and implement cutting-edge new technologies and features, and fix product issues in the RHEL operating system across various virtualization and cloud platforms. You will have the opportunity to work with brilliant engineers from all over the world, and collaborate with Red Hat's partners and communities in an open source and agile development method.

What will you do: Maintain and update packages in RHEL as a VM on virtualization and cloud platforms, implement new features, and fix issues. Help others review and refine code. Plan and prioritize your work to complete it on time in the RHEL development cycle. Collaborate with the quality engineering team to ensure product quality, and help them understand requirements and develop test plans. Work together with the support team to get customer issues resolved. Cooperate with virtualization/cloud partners, and follow and understand their new features and requirements. Work with upstream communities and contribute your code upstream.

What will you bring: Bachelor's degree or above in a computer science related major. 5+ years of solid Linux experience, ideally with an understanding of Linux components (kernel, bootloader, memory, network, storage, graphics, etc.). 5+ years of professional experience in software development, with familiarity in Python/C/shell scripting. Ability to troubleshoot and solve problems independently. Self-motivated, responsible, and collaborative. Proficient in English reading, writing and speaking.

The following are considered a plus: Experience with Linux shell / Python / Ansible / PowerShell. Experience with virtualization (KVM, VMware, Hyper-V, OpenStack, etc.) or a cloud platform (Azure, AWS, Google, etc.). Experience with open source development and Git.

About Red Hat: Red Hat is the world's leading provider of enterprise software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
Posted 2 weeks ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities: Scala, Java, Spark (Spark Streaming, MLlib), Kafka or equivalent cloud big data components, SQL, PostgreSQL, T-SQL/PL-SQL, Hadoop (Airflow, Oozie, HDFS, Sqoop, Hive, Pig, MapReduce), shell scripting, and cloud technologies (GCP preferable).

Mandatory Skill Sets: Scala, Spark, GCP. Preferred Skill Sets: Scala, Spark, GCP. Years of Experience Required: 4-8. Education Qualification: B.Tech / M.Tech / MBA / MCA.

Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering. Degrees/Field of Study preferred: Certifications (if blank, certifications not specified).

Required Skills: GCP. Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}.

Desired Languages (If blank, desired languages not specified). Travel Requirements: Not Specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No. Job Posting End Date
Posted 2 weeks ago
1.0 - 6.0 years
8 - 12 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Data Engineer - Data Engineering & Analytics

What you'll do: Create and maintain optimal data pipeline architecture. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability. Design, develop and deploy high-volume ETL pipelines to manage complex and near-real-time data collection. Develop and optimize SQL queries and stored procedures to meet business requirements. Design, implement, and maintain REST APIs for data interaction between systems. Ensure performance, security, and availability of databases. Handle common database procedures such as upgrade, backup, recovery, migration, etc. Collaborate with other team members and stakeholders. Prepare documentation and specifications.

What you'll bring: Bachelor's degree in Computer Science, Information Technology, or a related field. 1+ years of experience with SQL, T-SQL, Azure Data Factory or Synapse, or a relevant ETL technology. Strong analytical skills (impact/risk analysis, root cause analysis, etc.). Proven ability to work in a team environment, creating partnerships across multiple levels. Demonstrated drive for results, with appropriate attention to detail and commitment. Hands-on experience with Azure SQL Database.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed.
Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com
Posted 2 weeks ago
2.0 - 5.0 years
13 - 17 Lacs
Hyderabad
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Develop/convert the database (Hadoop to GCP) for the specific objects (tables, views, procedures, functions, triggers, etc.) from one database platform to another. Implementation of a specific data replication mechanism (CDC, file data transfer, bulk data transfer, etc.). Expose data as APIs. Participation in the modernization roadmap journey. Analyze discovery and analysis outcomes. Lead discovery and analysis workshops/playbacks. Identification of application dependencies and source and target database incompatibilities. Analyze the non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance bench, etc.). Prepare the effort estimates, WBS, staffing plan, RACI, RAID, etc. Lead the team to adopt the right tools for various migration and modernization methods.

Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 2 weeks ago
5.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities: Build data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform. Develop streaming pipelines. Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing services. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on cloud data platforms on Azure. Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, and SQL Server DB. Exposure to streaming solutions and message brokers such as Kafka. Experience with Unix/Linux commands and basic work experience in shell scripting. Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Certified Spark Developer.
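As an illustration of the streaming-ingest work this role describes, here is a minimal PySpark Structured Streaming sketch, assuming the Kafka connector is available on the cluster; the broker address, topic, schema, and output paths are hypothetical placeholders.

```python
# Minimal PySpark Structured Streaming sketch: read JSON events from Kafka,
# apply a small transformation, and write them out as Parquet.
# Broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-ingest-demo").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
    .option("subscribe", "clickstream")                  # assumed topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/clickstream/")                        # assumed sink
    .option("checkpointLocation", "/checkpoints/clickstream/")
    .start()
)
```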
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Minimum 3 years of experience in developing application programs to implement ETL workflows by creating ETL jobs and data models in datamarts using Snowflake, DBT, Unix, and SQL technologies. Redesign Control-M batch processing for the ETL job build to run efficiently in production. Study existing systems to evaluate effectiveness and develop new systems to improve efficiency and workflow. Responsibilities: Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support. The developer should have a good understanding of working in an Agile environment and of tools such as JIRA and SharePoint. Good written and verbal communication skills are a must, as the candidate is expected to work directly with client counterparts. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Intuitive individual with an ability to manage change and proven time management skills. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications. Preferred technical and professional experience: Responsible for developing triggers, functions, and stored procedures to support this effort. Assist with impact analysis of changing upstream processes to Data Warehouse and Reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes. Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
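For a sense of the kind of Snowflake load step an ETL job like this might run, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of an ETL load step into Snowflake. Account, warehouse,
# database, stage, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # assumed account identifier
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load a staged CSV file into a staging table, then refresh a datamart table.
    cur.execute(
        "COPY INTO STG_ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        """
        INSERT INTO DM_DAILY_ORDERS
        SELECT order_date, COUNT(*) AS order_count
        FROM STG_ORDERS
        GROUP BY order_date
        """
    )
finally:
    conn.close()
```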
Posted 2 weeks ago
2.0 years
0 Lacs
India
Remote
About YipitData: YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world's largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc's Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency. We are hiring 2 software engineers (alternative title: backend engineer). One will join our Infrastructure Team, and the other will join our Data Feeds Team. This is a fully remote opportunity based in India. Standard work hours are from 8 am to 5 pm IST. You are also expected to be flexible in working hours at times to participate in US, China, and India collaboration. As Our Software Engineer in the Infrastructure Team, You Will: Build expertise in different email providers, such as Gmail, Outlook, Yahoo, Exchange, etc. Work on email strategy adjustments, performance improvements, email storage, and high availability and scalability. Be responsible for the email system's data storage and related improvements. Understand business requirements and participate in discussions with the different stakeholders to design technical solutions. Be creative and study new technologies in the space to ensure high availability and scalability for email systems. Maintain existing services, working on iterative upgrades, deployment improvements, and service governance. As Our Software Engineer in the Data Feeds Team, You Will: Develop, optimize, and maintain scalable data pipelines for structured and unstructured data processing. Maintain and enhance the stability, reliability, and high availability of existing data systems and services. Collaborate with the team to build and refine an expandable, high-performance data architecture. Partner with various teams to deliver high-quality, dependable data services to internal users. Understand product and business requirements to design and implement data functionalities, including intuitive data visualizations. Oversee the integration and maintenance of collaborative data with third-party clients, addressing their data mining and analytical requirements. Work closely with the US, Singapore, and China teams, adapting to flexible work hours as needed. Enforce best practices in data governance, security, and compliance to protect sensitive information. You Are Likely To Succeed If You Have: A Bachelor's degree in Computer Science or a related major and 2+ years of backend experience. A solid computer science foundation and programming skills, with familiarity with common data structures and algorithms. 2+ years of experience in one of the following languages: Go/Python. Familiarity with at least one of the following open-source components: MySQL/Redis/message queues/NoSQL.
Experience in architecting and developing large-scale distributed systems (for the Infrastructure Team). Familiarity with one or more of the following: Spark, Hadoop, Hive, or Elasticsearch (for the Data Feeds Team). Excellent logic analysis capabilities, able to abstract and split business logic reasonably. Familiarity with email protocols (IMAP/SMTP) is a plus. What We Offer: Our compensation package includes comprehensive benefits, perks, and a competitive salary: We care about your personal life and we mean it. We offer vacation time, parental leave, team events, learning reimbursement, and more! Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity employer. Job Applicant Privacy Notice
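Since the Infrastructure Team role centres on email systems and IMAP/SMTP familiarity is called out as a plus, here is a minimal sketch using only the Python standard library; the host, credentials, and mailbox are hypothetical placeholders.

```python
# Minimal sketch: connect to an IMAP mailbox and print the subjects of up to
# ten unread messages, using only the Python standard library.
# Host and credentials are hypothetical placeholders.
import email
import imaplib

HOST = "imap.example.com"   # assumed provider host
USER = "ops@example.com"
PASSWORD = "********"

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)
    _, data = imap.search(None, "UNSEEN")          # ids of unread messages
    for msg_id in data[0].split()[:10]:            # cap at 10 for the demo
        _, msg_data = imap.fetch(msg_id, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        print(msg_id.decode(), msg.get("Subject"))
```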
Posted 2 weeks ago
6.0 - 11.0 years
18 - 25 Lacs
Hyderabad
Work from Office
SUMMARY Data Modeling Professional. Location: Hyderabad/Pune. Experience: The ideal candidate should possess at least 6 years of relevant experience in data modeling with proficiency in SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools). Key Responsibilities: Develop and configure data pipelines across various platforms and technologies. Write complex SQL queries for data analysis on databases such as SQL Server, Oracle, and Hive. Create solutions to support AI/ML models and generative AI. Work independently on specialized assignments within project deliverables. Provide solutions and tools to enhance engineering efficiencies. Design processes, systems, and operational models for end-to-end execution of data pipelines. Preferred Skills: Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is advantageous. Requirements: Strong problem-solving and analytical abilities. Excellent communication and presentation skills. Ability to deliver high-quality materials against tight deadlines. Effective under pressure with rapidly changing priorities. Note: The ability to communicate efficiently at a global level is paramount. Minimum 6 years of experience in data modeling with SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools). Proficiency in writing complex SQL queries for data analysis. Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is an advantage. Strong problem-solving and analytical abilities. Excellent communication and presentation skills. Ability to work effectively under pressure with rapidly changing priorities.
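To illustrate the kind of scheduled pipeline this role configures, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ with the bash operator; the DAG id, schedule, tables, and commands are hypothetical placeholders.

```python
# Minimal Airflow 2.4+ DAG sketch: a daily pipeline with a Hive-style
# transform step followed by a placeholder load step. The DAG id, schedule,
# table names, and commands are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_orders_datamart",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",          # run daily at 02:00
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="hive_transform",
        bash_command=(
            "hive -e 'INSERT OVERWRITE TABLE dm.daily_orders "
            "SELECT order_date, COUNT(*) FROM stg.orders GROUP BY order_date'"
        ),
    )
    load = BashOperator(
        task_id="load_to_bigquery",
        bash_command="echo 'bq load step would go here'",  # placeholder step
    )

    transform >> load
```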
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Support team responsible for providing technical assistance to Zeta clients and internal business functions. This is a customer-facing role and requires excellent prioritization, responsiveness, and customer service, along with excellent verbal communication skills. Answering questions from customers about the features and capabilities of our Zeta Application products. Developing customer-facing documentation on using certain features on an as-needed basis. Ensure that end-to-end display campaigns are run effectively, including tagging, trafficking, and optimization. Become a subject matter expert on programmatic topics such as platform functionality, campaign best practices, pixel implementation, creative troubleshooting, and more. Provide technical support for programmatic platforms, campaign performance, and external DSP tools. Triage support tickets with issue summary, urgency, and next steps when input is needed from backend engineering teams. Shift Timings: Night Shift (EST & PST). Education: BSC / BTech / MCA / MSC. Must-have Skills: Functional Skills and Experience: At least 3 years' experience in a 24/7 environment providing technical support. Extensive problem-solving and debugging skills. Excellent interpersonal and communication skills. Flexible in working outside of core business hours at short notice. Should have excellent written and verbal communication skills. Experience of managing customers across locations/geographies is preferred. Deep knowledge of the programmatic ecosystem. In-depth understanding of DSPs, programmatic advertising, real-time bidding, and ad operations. Demonstrated analytical ability. Experience with troubleshooting ad delivery issues, pixel/tag implementation, and bid optimization. Experience using DSPs including (but not limited to) DoubleClick Bid Manager, The Trade Desk, and AppNexus; Zeta DSP a plus. Deep understanding of the Ad Tech industry and how Demand-Side Platforms (DSPs), ad servers, attribution platforms, etc. work in conjunction. Technical Skills and Experience: Strong MySQL/Oracle database skills, with a minimum of 2 years of work experience involving databases (MySQL, Vertica, Hive). Good knowledge of, and hands-on experience with, the Linux operating system, web technologies, and networking basics. Good to Have: Certification in programmatic platforms (e.g., Google Marketing Platform, The Trade Desk).
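As a sketch of the kind of database check a support engineer might run while triaging an ad-delivery ticket, here is a minimal example using the PyMySQL client; the host, schema, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: query a MySQL reporting database for campaigns with zero
# delivery today, the sort of check run while triaging a support ticket.
# Host, schema, table, and column names are hypothetical placeholders.
import pymysql

conn = pymysql.connect(
    host="reporting-db.example.com",   # assumed reporting host
    user="support_ro",
    password="********",
    database="adops",
)

try:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT campaign_id, SUM(impressions) AS imps_today
            FROM delivery_hourly
            WHERE event_date = CURDATE()
            GROUP BY campaign_id
            HAVING imps_today = 0
            """
        )
        for campaign_id, imps in cur.fetchall():
            print(f"campaign {campaign_id}: no delivery today ({imps} impressions)")
finally:
    conn.close()
```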
Posted 2 weeks ago
3.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities Roles & Responsibilities: Total Experience: 3 to 10 years Languages: Scala/Python 3.x File System: HDFS Frameworks: Spark 2.x/3.x (Batch/SQL API), Hadoop, Oozie/Airflow Databases: HBase, Hive, SQL Server, Teradata Version Control System: GitHub Other Tools: Zendesk, JIRA Mandatory Skill Sets Big Data, Python, Hadoop, Spark Preferred Skill Sets Big Data, Python, Hadoop, Spark Years Of Experience Required 3-10 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Big Data Optional Skills Python (Programming Language) Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
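For the listed stack (Spark batch/SQL API over Hive tables on HDFS), here is a minimal PySpark batch sketch; the database, table, and output path names are hypothetical placeholders rather than anything from the posting.

```python
# Minimal PySpark batch sketch: read a Hive table, aggregate with the
# DataFrame API, and write the result back to HDFS as Parquet.
# Database, table, and path names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count

spark = (
    SparkSession.builder
    .appName("hive-batch-demo")
    .enableHiveSupport()          # use the Hive metastore for table lookups
    .getOrCreate()
)

orders = spark.table("sales.orders")          # assumed Hive table

daily_counts = (
    orders.filter(col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(count("*").alias("completed_orders"))
)

daily_counts.write.mode("overwrite").parquet("hdfs:///warehouse/dm/daily_counts")
```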
Posted 2 weeks ago
Hive is a popular data warehousing tool used for querying and managing large datasets in distributed storage. In India, the demand for professionals with expertise in Hive is on the rise, with many organizations looking to hire skilled individuals for various roles related to data processing and analysis.
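As an illustration of what querying Hive from Python looks like, here is a minimal sketch assuming the PyHive client and a reachable HiveServer2 endpoint; the host, database, and table names are hypothetical placeholders.

```python
# Minimal sketch: run a HiveQL aggregation from Python via PyHive.
# Host, port, database, and table names are hypothetical placeholders,
# and a reachable HiveServer2 endpoint is assumed.
from pyhive import hive

conn = hive.Connection(host="hive-server.example.com", port=10000, database="default")

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT category, COUNT(*) AS cnt "
        "FROM web_events GROUP BY category ORDER BY cnt DESC LIMIT 10"
    )
    for category, cnt in cur.fetchall():
        print(category, cnt)
finally:
    conn.close()
```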
Cities such as Hyderabad, Bengaluru, Pune, and Gurugram, where many of the roles above are based, are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.
The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.
Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.
As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
Accenture
36723 Jobs | Dublin
Wipro
11788 Jobs | Bengaluru
EY
8277 Jobs | London
IBM
6362 Jobs | Armonk
Amazon
6322 Jobs | Seattle, WA
Oracle
5543 Jobs | Redwood City
Capgemini
5131 Jobs | Paris, France
Uplers
4724 Jobs | Ahmedabad
Infosys
4329 Jobs | Bangalore, Karnataka
Accenture in India
4290 Jobs | Dublin 2