4 - 6 years
6 - 8 Lacs
Chennai, Hyderabad
Work from Office
What you'll be doing... The Financial Planning & Analytics positions offer opportunities to drive better business partnering and insights while developing your FP&A skill set and leadership as we continue to grow as a world-class organization. You are a valuable business partner, finding new and better ways of working by leveraging digital capabilities. You'll become involved in, but not limited to, planning, reporting, analyses, and initiatives that will impact important decisions around the growth and development of the Verizon business.
- Develop a team of professionals and guide overall strategic and tactical direction.
- Ensure timely and accurate delivery of financial planning, reporting, and analyses for the business.
- Oversee the creation and communication of executive reviews, financial/operational storytelling, and insights.
- Drive continuous improvement and scope expansion through project identification and prioritization, resource alignment and utilization, and digital/automation tool and system integration strategy.
- Be a trusted advisor by providing valuable decision support and guidance to business partners.
- Perform and present operational reviews to FPAS executive leadership on the team's strategy, development, initiatives, accomplishments, risks, and areas of opportunity.
- Actively participate in the cultural and professional development of FPAS as a world-class organization through mentor/mentee relationships and various activities offered.
- Drive data-derived insights across the business domain by developing advanced statistical models, machine learning algorithms, and computational algorithms based on business initiatives.
- Direct the gathering of data, assess data validity, and synthesize data into large analytics datasets to support project goals.
- Analyze structured and unstructured data using data mining, statistical, machine learning, and deep learning tools; data visualization; digital/web analytics tools and techniques; and storytelling.
Building and training statistical models and machine learning algorithms. Preparing all the supporting documents and reports.
What we're looking for... You have strong analytical skills and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end-to-end analytical solutions, and communicate insights and findings to leadership. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and multi-functional teams to implement data-science-driven business solutions.
You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Experience in advanced analytics/predictive modeling in a consulting role.
- Experience with all phases of an end-to-end analytics project, such as (but not limited to): ingestion, munging, model building, validation, operationalization, and monitoring.
- Experience building models using statistical techniques such as supervised and unsupervised machine learning, deep learning, text classification, etc.
- Knowledge of Python, SQL, PySpark, Scala, and/or other languages and tools.
- Experience with visualization software like Qlik Sense, Tableau, or Looker.
Even better if you have one or more of the following:
- Master's degree.
- Experience working on analytical projects in the telecom or finance domain.
- Accuracy and attention to detail.
- Experience in SQL (Teradata, GCP), ETL, and data lakes.
- Experience integrating any of the GenAI tools.
- Experience working with cross-functional teams in a dynamic environment.
- Excellent verbal and written communication.
- Experience presenting to and influencing functional leaders and partners.
- Ability to work effectively independently and willingness to learn new technologies.
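The supervised-learning skills this posting asks for (model building, text classification) can be sketched in miniature. The example below is purely illustrative and hypothetical, not the employer's tooling: a library-free nearest-centroid text classifier over bag-of-words counts, with made-up training phrases.

```python
from collections import Counter

def bow(text):
    """Bag-of-words term frequencies for a lowercase, whitespace-tokenized string."""
    return Counter(text.lower().split())

def train_centroids(labeled_docs):
    """Average each class's term counts into a centroid vector."""
    centroids = {}
    for label, docs in labeled_docs.items():
        total = Counter()
        for d in docs:
            total.update(bow(d))
        centroids[label] = {t: c / len(docs) for t, c in total.items()}
    return centroids

def classify(text, centroids):
    """Pick the class whose centroid overlaps most with the document's terms."""
    vec = bow(text)
    def score(centroid):
        return sum(w * vec.get(t, 0) for t, w in centroid.items())
    return max(centroids, key=lambda lbl: score(centroids[lbl]))

# Hypothetical training data for a two-class support-ticket classifier.
train = {
    "billing": ["my invoice is wrong", "billing charge dispute on invoice"],
    "network": ["signal drops on my network", "network outage and weak signal"],
}
model = train_centroids(train)
print(classify("why is my invoice so high", model))  # billing
```

Real projects would replace the centroid scorer with a trained model (e.g. logistic regression or a fine-tuned transformer), but the train/predict split shown here is the same shape.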
Posted 3 months ago
6 - 8 years
8 - 12 Lacs
Chennai, Hyderabad
Work from Office
What you'll be doing... In this role as a Senior Engineer in the Tech Strategy & Planning team for the TSGI Cloud and Enterprise Architecture team, you'll be managing multiple programs designed to help GTS move towards its strategic objectives. You will be a key contributor in planning and building the Data, Platforms, and Emerging Technology north star, and ensuring the smooth and successful execution of initiatives by coordinating with cross-functional teams. Your expertise will help us solve complex problems and find unique solutions to optimize the technology landscape of our organization. Your responsibilities include but are not limited to:
- Technology road-mapping: Develop and maintain a technology roadmap that aligns with the organization's strategic goals and evolving industry trends.
- Strategic planning: Collaborate with leadership to identify opportunities where technology can be a competitive advantage and contribute to the development of the company's overall strategic plan.
- Innovation and research: Stay updated on emerging technologies, assess their relevance to the business, and propose innovative solutions.
- Vendor and partner management: Evaluate and manage relationships with technology vendors and partners to ensure they support the organization's strategic objectives.
- Evaluate GenAI tools in partnership with the AI&D team.
- Develop scalable prototypes of LLM and NLP modules and systems that are critical to the company's product lines.
- Apply state-of-the-art LLM techniques to understand large amounts of unstructured data and translate them into meaningful, structured data.
- Design, develop, and evaluate predictive LLM models that are on par with industry standards, and define metrics that measure success and customer value delivery.
- Work closely with process experts to analyze and design solutions based on business requirements.
What we're looking for... You're analytical and great at quickly grasping challenging concepts.
As a strong written and verbal communicator, you deliver complex messages vividly to technical and business audiences alike. You're no stranger to a fast-paced environment and tight deadlines, and you adapt to changing priorities and balance multiple projects with ease. You take pride in your work and get a lot of satisfaction from meeting and exceeding the expectations of your customers.
You'll need to have:
- Bachelor's degree or four or more years of experience.
- Minimum 6 years of experience in one or more of Data Science, LLMs, and GenAI.
- Established experience delivering information management solutions to large numbers of end users.
- Experience building LLM solutions to business problems across support, sales, digital, chat, voice, etc.
- Familiarity with GenAI models (OpenAI, Gemini, Vertex AI, Mistral, LLaMA, etc.) and fine-tuning them for domain-specific needs.
- Experience in text processing, vector databases, and embedding models.
- Experience in NLP, transformers, and neural networks.
- Hands-on experience using LangChain and good exposure to LLMOps.
- Strong independent and creative research skills necessary to keep up with the latest trends in advanced analytics and ML.
- Ability to research, recommend, and implement best practices for LLM and NLP systems that can scale.
- Ability to identify new process opportunities while quickly assessing feasibility.
- Excellent written and verbal communication skills: able to effectively communicate technical details to a non-technical audience and produce clear, concise written documentation.
Even better if you have one or more of the following:
- Familiarity with graph DB concepts.
- Familiarity with one or more data platforms: Cloudera, Snowflake, Databricks, etc.
- Experience in one or more big data and ETL technologies: Informatica, Talend, Teradata, Hadoop, etc.
- Experience in one or more BI platforms: Tableau, Looker, ThoughtSpot, etc.
- Master's degree from an accredited college or university preferred.
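The vector-database and embedding-model requirements above boil down to one core operation: cosine-similarity retrieval over embedding vectors. Here is a minimal, hypothetical sketch with hand-written 3-dimensional "embeddings"; a real LLM pipeline would obtain high-dimensional vectors from an embedding model and store them in a vector database, but the ranking step is the same.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings" -- in practice these come from an embedding model
# and live in a vector database, not an in-memory dict.
index = {
    "reset your router": [0.9, 0.1, 0.0],
    "upgrade your plan": [0.1, 0.9, 0.2],
    "international roaming rates": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k documents whose embeddings are most similar to the query."""
    ranked = sorted(index, key=lambda doc: cosine(index[doc], query_vec), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.1]))  # ['reset your router']
```

In a retrieval-augmented (RAG) setup, the retrieved documents would then be passed to the LLM as context for answer generation.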
Posted 3 months ago
4 - 8 years
6 - 10 Lacs
Chennai, Hyderabad
Work from Office
What you'll be doing... The Commercial Data & Analytics - Impact Analytics team is part of the Verizon Global Services (VGS) organization. The Impact Analytics team addresses high-impact, analytically driven projects focused on three core pillars: Customer Experience, Pricing & Monetization, and Network & Sustainability. In this role, you will analyze large data sets to draw insights and solutions that help drive actionable business decisions, and apply advanced analytical techniques and algorithms to help us solve some of Verizon's most pressing challenges.
- Analyze large structured and unstructured datasets to draw meaningful and actionable insights.
- Envision and test for corner cases.
- Build analytical solutions and models by manipulating large data sets and integrating diverse data sources.
- Present the results and recommendations of statistical modeling and data analysis to management and other stakeholders.
- Identify data sources and apply your knowledge of data structures, organization, transformation, and aggregation techniques to prepare data for in-depth analysis.
- Understand data requirements, translate business needs into analytical solutions, and ensure the delivery of impactful insights.
- Assist in building data views from disparate data sources to power insights and business cases.
- Apply statistical modeling techniques/ML to data and perform root cause analysis and forecasting.
- Collaborate with cross-functional teams to discover the data sources and fields most appropriate to the business needs.
- Design modular, reusable Python scripts to automate data processing.
- Locate and define new process improvement opportunities.
What we're looking for... You have strong analytical skills, and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end-to-end analytical solutions, and communicate insights and findings to leadership.
You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and cross-functional teams to implement data-science-driven business solutions.
You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Demonstrated experience writing queries for reporting, analysis, and extraction of data from big data systems (Google Cloud Platform, Teradata, Spark, Splunk, etc.).
- Curiosity to dive deep into data inconsistencies and perform root cause analysis.
- Programming experience in Python (Pandas, NumPy, SciPy, and scikit-learn).
- Experience with visualization tools: matplotlib, seaborn, Tableau, Grafana, etc.
- Understanding of statistical modeling such as regression and other supervised machine learning models.
- Understanding of time series modeling and forecasting techniques.
Even better if you have one or more of the following:
- Ability to collaborate effectively across teams for data discovery and validation.
- Experience with machine learning, statistics and probability, NLP, and deep learning, especially in recommendation systems, conversational systems, information retrieval, computer vision, or regression modeling.
- Excellent interpersonal, verbal, and written communication skills.
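The regression and forecasting techniques listed above can be illustrated with the simplest possible case: an ordinary-least-squares linear trend extrapolated forward. This is a hypothetical, dependency-free sketch; in practice the posting's stack (Pandas/SciPy/scikit-learn, or dedicated time-series libraries) would do this with far more rigor.

```python
def linear_trend_forecast(series, horizon):
    """Fit y = a + b*t by ordinary least squares, then extrapolate `horizon` steps."""
    n = len(series)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(series) / n
    # Slope: covariance of (t, y) divided by variance of t.
    b = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, series)) \
        / sum((t - t_mean) ** 2 for t in ts)
    a = y_mean - b * t_mean
    return [a + b * (n + h) for h in range(horizon)]

# A perfectly linear toy series: 10, 12, 14, 16 -> next two points continue the trend.
print(linear_trend_forecast([10, 12, 14, 16], 2))  # [18.0, 20.0]
```

Real demand or KPI series have seasonality and noise, which is where ARIMA-style or ML forecasters come in, but the fit-then-extrapolate structure is the same.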
Posted 3 months ago
8 - 12 years
32 - 37 Lacs
Hyderabad
Work from Office
Job Overview: As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.
Responsibilities:
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.
Qualifications:
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
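One of the responsibilities above, creating source-to-target mappings (STMs) for ETL developers, can be sketched as data rather than code: each target column names its source field and a transform. All column names below are hypothetical illustrations, not PepsiCo's actual model.

```python
# A source-to-target mapping expressed declaratively. In real projects the STM
# lives in a modeling tool or spreadsheet and is handed to ETL developers;
# here it drives the transformation directly.
STM = {
    "customer_id":  ("cust_no", str.strip),   # trim fixed-width padding
    "country_code": ("ctry",    str.upper),   # normalize to ISO-style upper case
    "revenue_usd":  ("rev",     float),       # cast the string measure to numeric
}

def apply_stm(source_row):
    """Project a raw source row onto the target model via the mapping."""
    return {
        tgt_col: transform(source_row[src_col])
        for tgt_col, (src_col, transform) in STM.items()
    }

row = {"cust_no": " 00042 ", "ctry": "in", "rev": "1234.5"}
print(apply_stm(row))  # {'customer_id': '00042', 'country_code': 'IN', 'revenue_usd': 1234.5}
```

Keeping the mapping declarative is what makes the "extensible philosophy" the posting mentions cheap to honor: adding a target column is one new entry, not new procedural code.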
Posted 3 months ago
6 - 10 years
8 - 13 Lacs
Mumbai
Work from Office
Role & responsibilities: 6+ years of experience with a proven track record of:
- Strong SQL background
- Strong DataStage tool experience
- Airflow, GitHub, Git Actions, or Jenkins
- Teradata or similar DB experience
- Python and Spark background is a plus
Posted 3 months ago
6 - 10 years
6 - 16 Lacs
Bangalore Rural
Work from Office
Skills required:
- Experience in Python/PySpark scripting
- Experience in AWS Glue/Lambda
- Experience in Snowflake (SQL, stored procedures)
- Experience in SQL/Teradata/DB2 RDBMS
- Experience in any ETL tool, such as Informatica or DataStage
- Experience in Airflow or an equivalent orchestration tool
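The orchestration skill in the list above (Airflow or equivalent) is, at its core, dependency-ordered task scheduling. As a hypothetical sketch, the pipeline below mirrors the stack named in the posting: an extract step, a Snowflake load, a stored procedure, then a report refresh. The standard library's `graphlib` resolves the same kind of dependency graph that an Airflow DAG declares; the task names are invented for illustration.

```python
from graphlib import TopologicalSorter

# Task -> set of upstream tasks it depends on, exactly how an Airflow DAG
# wires `extract >> load >> proc >> refresh`, but in plain Python.
dag = {
    "extract_pyspark": set(),
    "load_snowflake": {"extract_pyspark"},
    "run_stored_proc": {"load_snowflake"},
    "refresh_reports": {"run_stored_proc"},
}

# static_order() yields tasks in an order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract_pyspark', 'load_snowflake', 'run_stored_proc', 'refresh_reports']
```

Airflow adds scheduling, retries, and monitoring on top, but a DAG run is ultimately this topological walk with real operators at each node.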
Posted 3 months ago
2 - 5 years
4 - 7 Lacs
Coimbatore
Work from Office
To support our extraordinary teams who build great products and contribute to our growth, we're looking to add a Snowflake Developer - IT located in Coimbatore. Reporting to the Manager, the role involves:
What a typical day looks like:
- Build ETL solutions, including loading data from disparate data sets (relational, structured, unstructured, flat files, etc.).
- Data engineering activities using programming languages like Python, C#, or Java, and databases like Snowflake, MySQL, SQL Server, Oracle, etc.
- Data modeling, designing, and building ETL jobs.
- Perform POCs for new technologies and frameworks.
- Work as an individual contributor.
- Create comprehensive and accurate documentation of the infrastructure and implementation.
The experience we're looking to add to our team:
- Any Bachelor's or Master's degree, or an equivalent degree.
- 2-5 years of experience, with hands-on experience as a Snowflake developer in scripting, automation, and building ETL solutions.
- Good knowledge of data structures, algorithms, theories, principles, and practices.
- Good understanding of one programming language: Python, Java, or C#.
- Strong understanding of RDBMS concepts using MySQL, Snowflake, Teradata, SQL Server, or Oracle.
- Knowledge of the Linux command line and shell programming is a plus.
- Being an OSS committer is a plus.
- Knowledge of Git and Jenkins is a plus.
- Knowledge of any NoSQL database is a plus.
- Good written and verbal communication skills; should be a good team player.
What you'll receive for the great work you provide: Health Insurance, Paid Time Off.
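The first bullet above, loading flat-file extracts and transforming them, is the bread and butter of a Snowflake ETL job. As a minimal, hypothetical sketch (stdlib only, with an inline CSV standing in for a real flat-file extract; a production job would stage the file and `COPY INTO` a Snowflake table), here is the parse-and-aggregate step:

```python
import csv
import io

# Inline stand-in for a flat-file extract landed by an upstream system.
RAW = """order_id,amount,region
1001,250.00,south
1002,125.50,north
1003,99.99,south
"""

def load_and_aggregate(raw_csv):
    """Parse a CSV extract and total `amount` by `region` -- the kind of
    transform an ETL job performs before (or after) loading to Snowflake."""
    totals = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):
        region = row["region"]
        totals[region] = totals.get(region, 0.0) + float(row["amount"])
    return totals

print(load_and_aggregate(RAW))
```

In Snowflake itself the same aggregation would be a `SELECT region, SUM(amount) ... GROUP BY region` over the loaded table; doing it in Python is shown only to make the transform step concrete.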
Posted 3 months ago
6 - 11 years
17 - 30 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant - Microsoft ETL Lead Engineer (Database design | Agile development process | Release management)!
Position Overview: We are currently seeking a highly experienced ETL engineer with hands-on experience in Microsoft ETL (Extract, Transform, Load) technologies. The ideal candidate will have a deep understanding of ETL processes, data warehousing, and data integration, with a proven track record of leading successful ETL implementations. As a principal ETL engineer, you will play a pivotal role in architecting, designing, and implementing ETL solutions to meet our organization's data needs.
Key Responsibilities:
• Lead the design and development of ETL processes using Microsoft ETL technologies such as SSIS (SQL Server Integration Services); hands-on work is mandatory (70% development, 30% leadership).
• Collaborate with stakeholders to gather and analyze requirements for data integration and transformation.
• Design and implement data quality checks and error handling mechanisms within ETL processes.
• Lead a team of ETL developers, providing technical guidance, mentorship, and oversight.
• Perform code reviews and ensure adherence to best practices and coding standards.
• Troubleshoot and resolve issues related to data integration, ETL performance, and data quality.
• Work closely with database administrators, data architects, and business analysts to ensure alignment of ETL solutions with business requirements.
• Stay up to date with the latest trends and advancements in ETL technologies and best practices.
• Identify and resolve performance bottlenecks; implement best practices for database performance tuning and optimization.
• Ensure data integrity, security, and availability.
• Create and maintain documentation for database designs, configurations, and procedures.
• Ensure compliance with data privacy and security regulations.
Qualifications:
Education:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience:
• Experience designing, developing, and implementing ETL solutions using Microsoft ETL technologies, particularly SSIS.
• Strong understanding of data warehousing concepts, dimensional modeling, and ETL design patterns.
• Proficiency in SQL and experience working with relational databases, preferably Microsoft SQL Server.
• Experience leading ETL development teams and managing end-to-end ETL projects.
• Proven track record of delivering high-quality ETL solutions on time and within budget.
• Experience with other Microsoft data platform technologies (e.g., SSAS, SSRS) is a plus.
• Familiarity with version control systems (e.g., Git).
• Knowledge of containerization and orchestration (e.g., Docker, Kubernetes) is a plus.
Soft Skills:
• Strong analytical and problem-solving skills
• Excellent communication and collaboration abilities
• Ability to work independently and as part of a team
Preferred Qualifications:
• Experience with cloud-based database services (e.g., AWS RDS, Google Cloud SQL)
• Knowledge of other database systems (e.g., PostgreSQL, Oracle)
• Familiarity with Agile development methodologies
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 months ago
6 - 10 years
6 - 16 Lacs
Bengaluru
Work from Office
Skills Required: Experience in Python/Pyspark scripting, Experience in AWS Glue/lambda, Experience in Snowflake (sql, stored procs), Experience in SQL/Teradata/db2 Rdbms, Experience in any ETL tool like infra or datastage, Experience in Airflow or equivalent orchestration.
Posted 3 months ago
6 - 10 years
8 - 13 Lacs
Bengaluru
Work from Office
Role & responsibilities 6+ years of experience with proven track of - Strong SQL background Strong DataStage tool experience Airflow, GitHub, Git Actions or Jenkins Teradata or similar DB experience Python and Spark background is plus
Posted 3 months ago
6 - 10 years
8 - 13 Lacs
Hyderabad
Work from Office
Role & responsibilities 6+ years of experience with proven track of - Strong SQL background Strong DataStage tool experience Airflow, GitHub, Git Actions or Jenkins Teradata or similar DB experience Python and Spark background is plus
Posted 3 months ago
5 - 10 years
6 - 12 Lacs
Bengaluru
Work from Office
Posted 3 months ago
5 - 10 years
6 - 12 Lacs
Hyderabad
Work from Office
Posted 3 months ago
4 - 7 years
7 - 10 Lacs
Bangalore Rural
Work from Office
Hiring for DataStage Developer (contract role). Experience: 4+ years. Location: PAN India. Contact: Sivasakthi, sivashakthi@srsinfoway.com
Posted 3 months ago
4 - 7 years
7 - 10 Lacs
Pune
Work from Office
Hiring for DataStage Developer (contract role). Experience: 4+ years. Location: PAN India. Contact: Sivasakthi, sivashakthi@srsinfoway.com
Posted 3 months ago
10 - 20 years
45 - 50 Lacs
Hyderabad
Work from Office
Overview: The primary focus is to lead data architecture activities for critical projects. This role will be responsible for architecting, designing, and implementing advanced analytics capabilities and platform oversight within Azure Data Lake, Databricks, and other related ETL technologies, satisfying project requirements while adhering to enterprise architecture standards. The role will translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses, and implement cutting-edge solutions by building semantic and virtual data layers to provide faster, federated query execution.
Responsibilities:
- Lead data architecture for critical data and analytics projects.
- Drive and deliver data architecture and solution architecture deliverables such as conceptual, logical, and physical architecture.
- Partner with Enterprise Architecture (Data & Analytics) and ensure the usage of standard patterns.
- Partner with project leads, IT leads, Security, and Enterprise Architecture team members in architecting end-to-end solutions.
- Gain architecture alignment and sign-off, and guide the project team during implementation.
Mandatory technical skills:
- 10+ years of experience in Teradata and the Hadoop ecosystem (e.g., Hive, Spark, Kafka, HBase) and Azure cloud technologies.
- 3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming data on the Azure cloud platform at scale.
- 1 to 3 years of experience using ETL tools like Informatica, ADF, or similar, especially for large volumes of data.
- 1 to 3 years of hands-on experience with Databricks.
- 3 to 5 years of working experience with Azure cloud technologies like Spark, IoT, Synapse, Cosmos DB, Log Analytics, ADF, ADLS, Blob Storage, etc.
- 1 to 2 years of experience with distributed querying tools like Presto or similar.
- 1 to 2 years of experience with virtualization tools like Denodo or similar.
- 1 to 3 years of experience evaluating emerging technologies.
- 1 to 3 years of experience in Python/PySpark/Scala to build data processing applications.
- Experience extracting, querying, and joining large data sets at scale.
Mandatory non-technical skills:
- Highly analytical, motivated, decisive thought leader with solid critical thinking and the ability to quickly connect technical and business dots.
- Strong communication and organizational skills; can deal with ambiguity while juggling multiple priorities and projects at the same time.
Differentiating competencies:
- Experience in data wrangling, advanced analytic modelling, and AI/ML capabilities is preferred.
- Finance functional domain expertise.
Posted 3 months ago
12 - 16 years
20 - 35 Lacs
Pune
Work from Office
Senior Data Modeller: 12-17 years of experience. Banking DWH project and Teradata experience. Teradata FSDM banking data model implementation experience (if not, at least IBM BDW, which is similar; FSDM will be a big plus). Required candidate profile: the current expectation is onsite. English is required; Arabic is preferred and will be a plus.
Posted 3 months ago
3 - 5 years
8 - 12 Lacs
Bengaluru
Work from Office
The primary responsibility is to report, analyze, and provide insights on customer experience at Lowe's across selling channels, customer segments, and product categories. The individual will apply analytical methods to combine internal and external data and analyze trends in customer experience metrics and the factors that play a key role in driving improvement in those metrics. This position is responsible for following best practices in turning business questions into data analysis, analyzing results, and identifying insights for decision making; determining additional research/analytics that may be needed to enhance the solution; and coordinating with cross-functional teams to ensure project continuity and strategic alignment. The individual will also proactively take up initiatives to apply modern tools and techniques to improve the efficiency, accuracy, and overall quality of insights offered to stakeholders.
Core responsibilities:
- Analyze customer feedback, LTR, and NPS data to understand in-market trends and where to focus to improve customer experience using the Medallia platform; create reports and dashboards using Medallia capabilities.
- Work with our US (Mooresville) team to assist in defining various reporting/analysis needs and building appropriate methodologies to provide actionable insights on experience metrics.
- Identify the appropriate univariate and multivariate analyses to surface key customer trends and insights: segmentation, Bayesian networks, factor analysis, etc.
- Synthesize disparate sources of data, primary and secondary, to develop cohesive stories, trends, and insights that will drive strategic decision making across the enterprise.
- Leverage available information across workstreams to help connect dots and provide holistic insights on customer experience trends and strategy.
- Work with the data operations teams to enhance data capabilities and develop tools to improve ease of access to data and BI for the broader organization.
Years of experience: 3-5 years of hands-on experience with customer experience analytics / customer analytics / customer insights.
Required minimum qualifications: Master's degree in Economics/Statistics/Analytics, or MBA in Marketing.
Primary skills (must have):
- Hands-on experience with the Medallia platform, SQL, Teradata, Hadoop, and Python.
- Hands-on analytics experience building statistical/mathematical models and multivariate analyses such as segmentation, logistic regression, Bayesian networks, factor analysis, conjoint analysis, etc.
- Ability to apply analytical tools and techniques to extract insights from structured and unstructured data.
- Consulting skills: ability to impact business decisions through analytics and research.
- Hands-on experience creating executive-level, audience-ready presentations that tell impactful stories.
- Excellent communication skills to connect with people from diverse backgrounds and experience.
Secondary skills (desired):
- Experience working with text data would be an advantage.
- Experience working with customer experience or voice-of-customer metrics is good to have.
- Familiarity with the retail industry and key concepts.
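The NPS metric central to this role has a simple, standard definition: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6), with passives (7-8) counted in the denominator only. A minimal sketch with invented survey scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    computed over all responses including passives (7-8)."""
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / n

# Hypothetical survey batch: 4 promoters, 2 passives, 2 detractors out of 8
# responses -> (4 - 2) / 8 * 100 = 25.0
print(nps([10, 9, 9, 10, 8, 7, 3, 6]))  # 25.0
```

In practice a platform like Medallia computes this per segment and over time; the analytical work is in explaining movements in the score, not the arithmetic itself.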
Posted 3 months ago
7 - 12 years
10 - 20 Lacs
Chennai, Hyderabad
Work from Office
• Mandatory skills: Redshift, Teradata, ETL tools, Kafka/Kinesis, Airflow, LSF scheduler, etc.
• Machine learning
Posted 3 months ago
3 - 8 years
8 - 18 Lacs
Chennai, Hyderabad
Work from Office
Title: Associate/Senior/Lead - Data Engineer
Experience: 3 to 8 years
Location: Chennai/Hyderabad
Required skills: GCP, BigQuery, Python/Hadoop, Teradata/DataProc, Airflow
Regards, Sharmeela (Sharmeela.s@saaconsulting.co.in)
Posted 3 months ago
6 - 11 years
15 - 20 Lacs
Pune
Work from Office
Job Position: Informatica/DM Express
Experience: 5+ years
Location: Pune
Notice Period: Immediate joiner
Mandatory Skills:
• 5+ years' experience with Informatica/DM Express ETL tools.
• 5+ years' experience with SQL Server/DB2/Teradata; should be able to design complex procedures and optimize query performance.
• 3+ years' experience in Unix shell scripting.
• SQL for data profiling, and strong knowledge for navigating databases and data structures; experience with operational data stores and/or enterprise data warehouses preferred.
• Experience with GCP (BigQuery) for query design and ETL workflow design.
• Reporting tools experience; experience in application development building and supporting downstream SAP BusinessObjects reporting solutions.
• Must have played a functional role in gathering requirements and working with business users; experience gathering/analysing requirements and translating them into detailed functional specifications.
• Strong analytical and problem-solving skills.
• Strong written and oral communication skills and an open communication style.
• Strong execution skills with the ability to work efficiently and deliver quality results within standards to meet deadlines.
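As an illustration of the data-profiling work this role calls for, a minimal sketch of profiling one column in a single query (the table, column, and data are hypothetical; Python's built-in sqlite3 stands in for the SQL Server/DB2/Teradata engines named above):

```python
import sqlite3

# Build a tiny sample table to profile.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "South"), (2, "South"), (3, None), (4, "North")],
)

# Row count, null count, and distinct non-null values in one pass.
total, nulls, distincts = conn.execute(
    """SELECT COUNT(*),
              SUM(CASE WHEN region IS NULL THEN 1 ELSE 0 END),
              COUNT(DISTINCT region)
       FROM orders"""
).fetchone()
print(total, nulls, distincts)  # 4 1 2
```

The same COUNT/CASE pattern ports directly to the commercial engines, which is what makes it a common profiling idiom.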
Posted 3 months ago
4 - 6 years
6 - 10 Lacs
Bengaluru
Work from Office
Education: Any Bachelor's/Master's degree
Experience: Overall 4-6 years of IT experience
Primary Skills:
• 4+ years of experience with Power BI.
• Working knowledge of the Power BI, SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence stack.
• Well versed in all BI and DWH (data warehousing) concepts and architecture.
• Familiarity with the tools and technologies of the Microsoft SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
• Knowledge of scripts to import data from databases, flat files, and log files.
• Design, build, maintain, and map data models to process raw data from unrelated sources.
• Transform complex data into easily understandable insights.
• Create multi-dimensional data models aligned with data warehousing best practices.
• Implement row-level security in the Power BI application, with an apt understanding of application security layer models.
• In-depth understanding of the overall development process for the listed tools: data extraction from various sources (such as SAP ERP, SAP BW, Oracle, Teradata, Snowflake); data transformation and performance-optimized modelling; data visualization using best practices with a strong end-user focus; application of complex authorization rules.
• Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).
Good to have:
• Experience with Qlik Sense / Qlik Cloud.
• Knowledge/experience in Power Automate and Power Apps.
• Official certification as a Qlik Sense, QlikView, and/or Power BI developer.
• Experience in advanced analytics using R or Python.
• Experience with mashup and extension development.
• Experience in the healthcare domain.
Mandatory Skills: Power BI, Qlik Sense, Qlik Cloud, Power Apps, DAX.
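Row-level security, which the listing calls out, amounts to filtering rows by the viewer's identity before they reach a report. A minimal Python sketch of the idea (the roles, regions, and figures are hypothetical; in Power BI itself this is expressed as DAX filter rules attached to roles, not application code):

```python
# Map each report viewer role to the regions it may see (hypothetical data).
ROLE_REGIONS = {
    "south_manager": {"South"},
    "north_manager": {"North"},
    "executive": {"South", "North"},
}

SALES = [
    {"region": "South", "amount": 120},
    {"region": "North", "amount": 80},
    {"region": "South", "amount": 40},
]

def visible_rows(role):
    """Return only the rows the given role is allowed to see."""
    allowed = ROLE_REGIONS.get(role, set())
    return [row for row in SALES if row["region"] in allowed]

print(sum(r["amount"] for r in visible_rows("south_manager")))  # 160
```

An unknown role sees nothing, which mirrors the deny-by-default posture RLS designs aim for.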
Posted 3 months ago
8 - 9 years
18 - 23 Lacs
Hyderabad
Work from Office
At least 8 years of experience in Big Data, PySpark, GCP, Hadoop Data Platform (HDP), and ETL, and capable of configuring data pipelines.
• Technical skills: SQL, Python, PySpark, Hive, Unix, ETL, Control-M (or similar).
• Skills and experience to support the upgrade from Hadoop to Cloudera.
• Good knowledge of industry best practices for ETL design, principles, and concepts.
• Data extraction from Teradata, Hive, Oracle, and flat files.
• Ability to work independently on specialized assignments within the context of project deliverables.
• Take ownership of providing solutions and tools that iteratively increase engineering efficiencies; designs should help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines.
• Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
• Communicate openly and honestly; advanced oral, written, and visual communication and presentation skills - the ability to communicate efficiently at a global level is paramount.
• Ability to deliver materials of the highest quality to management against tight deadlines, and to work effectively under pressure with competing and rapidly changing priorities.
• Responsible for automating the continuous integration/continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement.
• Keep up to date on current tools and technologies, and on areas such as cyber security and regulations pertaining to data privacy, consent, data residency, etc.
Requirements - to be successful in this role, you should meet the following:
• 8+ years of IT experience, with working knowledge of leading regulatory projects.
• In-depth knowledge and understanding of technical concepts, systems, and processes relevant to the organization.
• Strong working knowledge of PySpark, GCP, Dataproc, and Airflow; hands-on experience with Python, Spark, Hive, SQL, GCP, and Hadoop technologies.
• Good to have: exposure to Cloudera Data Platform (CDP).
• Experience handling/conducting scrum ceremonies and practising Agile best practices.
• Working knowledge of DevOps best practices and building CI/CD pipelines with end-to-end automation.
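The extract-transform-load flow this listing centres on can be sketched in miniature (the file contents, table name, and cleaning rule are all hypothetical; production pipelines of the kind described would use PySpark jobs orchestrated by Airflow or Control-M rather than the standard library):

```python
import csv
import io
import sqlite3

# Extract: read raw records (StringIO stands in for a flat-file source).
raw = io.StringIO("id,amount\n1,10\n2,-3\n3,25\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and drop invalid (negative-amount) records.
clean = [(int(r["id"]), int(r["amount"])) for r in rows if int(r["amount"]) >= 0]

# Load: write the cleaned rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # (2, 35)
```

Each stage is a separate, restartable step - the property a scheduler like Airflow exploits when it retries only the failed task.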
Posted 3 months ago
11 - 19 years
50 - 80 Lacs
Hyderabad
Work from Office
At Finance Automation, we are passionate about building systems and services that deliver a seamless and transparent finance experience for Amazon partners. We are responsible for building the systems that automate and orchestrate the services, analytics, and financial transactions that support Amazon customers and vendors worldwide. Our talented teams span the globe and depend on our systems to perform their work and serve customers efficiently, effectively, and securely. We build, operate, and scale systems that are responsible for billions of dollars in transactions and are central to the success of worldwide finance. If you're interested in building the next generation of financial, distributed, and directed work systems that support Amazon scale, and want to have a direct and immediate impact with customers, this job is for you. We are seeking a strong, experienced Data Engineering Manager who will lead the technical design, development, and launch of a new global system that will serve finance needs worldwide. Their leadership will deliver a new system and set of services that will be at the core of making it easier for customers and vendors to do business with Amazon Retail, AWS, etc. A strong candidate will have experience designing and building secure, distributed systems that operate globally. They will have experience building on AWS and will be responsible for the architecture, design, and implementation that provide availability and latency guarantees. Finally, they will be leaders on the team and will raise the bar for the organization by mentoring junior engineers, interviewing, and setting a high standard for code quality.
Key job responsibilities:
• 5+ years of data engineering experience, including experience as a Data Engineer or in a similar role.
• Experience managing a data or BI team.
• Experience with data modeling, data warehousing, and building ETL pipelines.
• Experience leading and influencing the data or BI strategy of your team or organization.
• Experience in SQL.
• A Bachelor's degree in Computer Science or a similar major.
• Hands-on experience leading delivery and operations of distributed, high-quality systems.
• Fundamentals in data structures, algorithms, problem solving, and complexity analysis.
• 2+ years of experience processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark, or a Hadoop-based big data solution).
• 2+ years of experience with relational database technology (such as Redshift, Oracle, MySQL, or MS SQL).
• 2+ years of experience developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes).
• Experience communicating to senior management and customers verbally and in writing.
• Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS.
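Data modeling for warehousing, listed above, typically means a star schema: a fact table of transactions keyed to descriptive dimension tables. A minimal sketch of the pattern (all table and column names are hypothetical, and sqlite3 stands in for a warehouse engine such as Redshift or Teradata):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per vendor, holding descriptive attributes.
conn.execute("CREATE TABLE dim_vendor (vendor_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per payment, referencing the dimension by key.
conn.execute(
    "CREATE TABLE fact_payment (vendor_id INTEGER REFERENCES dim_vendor, amount REAL)"
)
conn.executemany("INSERT INTO dim_vendor VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
conn.executemany(
    "INSERT INTO fact_payment VALUES (?, ?)", [(1, 100.0), (1, 50.0), (2, 75.0)]
)

# A typical BI query: total payments per vendor, joining fact to dimension.
totals = conn.execute(
    """SELECT v.name, SUM(f.amount)
       FROM fact_payment f JOIN dim_vendor v USING (vendor_id)
       GROUP BY v.name ORDER BY v.name"""
).fetchall()
print(totals)  # [('Acme', 150.0), ('Globex', 75.0)]
```

Keeping facts narrow and dimensions descriptive is what lets such queries aggregate billions of rows with a single join per dimension.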
Posted 3 months ago
13 - 20 years
20 - 35 Lacs
Chennai, Bengaluru, Coimbatore
Hybrid
Job description: We are looking for a Snowflake Architect with experience in the technical stack below. Please revert with the requested information if you are interested.
Required Skills: Snowflake, Oracle, Teradata, SQL Server, BI, ETL, AWS, SQL, Python, Spark
Qualification: Any degree
Experience: 13-20 years
Location: Chennai / Bangalore / Pune / Mumbai / Coimbatore
Work Time: 2pm to 11pm (UK shift)
Work Mode: Hybrid (3 days WFO and 2 days WFH)
Role & responsibilities:
• Proficient in data warehouses such as Snowflake, Teradata, Oracle, and SQL Server.
• Familiar with common BI and ETL solutions in the industry.
• Experienced with cloud environments, specifically AWS integrations.
• Extensive experience defining and rationalizing complex data models.
• Skilled in enterprise solution architecture using Snowflake and IICS.
• Capable of capturing customer requirements and conducting platform demos.
• Demonstrates technical leadership through personally developed code, solution architectures, and implementation recommendations.
• Acts as a trusted advisor to customer executive leadership, providing thought leadership and solution approaches.
• Extracts business requirements from customers and translates them into well-architected recommendations.
• Leverages best practices and solutions to minimize risks in customer implementations.
• Experienced in implementing and operating Snowflake-centric solutions.
• Proficient in implementing data security measures, access controls, and design within the Snowflake platform.
• Hands-on experience with SQL, Python, Java, and/or Spark for building, operating, and maintaining data analytics solutions is an added advantage.
Posted 3 months ago
Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.
Major tech hubs such as Chennai, Hyderabad, Bengaluru, and Pune are known for their thriving tech industries and have a high demand for Teradata professionals.
The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.
In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
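SQL fluency of the sort interviewers probe - window functions, for instance - can be practised anywhere. A small sketch using Python's built-in sqlite3 (the table and data are hypothetical; note that Teradata's dialect additionally offers the QUALIFY clause to filter on window results without a subquery):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("South", 10), ("South", 30), ("North", 20), ("North", 5)],
)

# Top sale per region via ROW_NUMBER() - a classic warehouse interview exercise.
top = conn.execute(
    """SELECT region, amount FROM (
           SELECT region, amount,
                  ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
           FROM sales)
       WHERE rn = 1 ORDER BY region"""
).fetchall()
print(top)  # [('North', 20), ('South', 30)]
```

The PARTITION BY / ORDER BY pair carries over unchanged to Teradata, Snowflake, and BigQuery, which is why it recurs in interviews across these roles.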
As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!