6.0 - 9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Summary

Strategy & Analytics

Strategy: Our Strategy practice brings together several key capabilities that will allow us to architect integrated programs that transform our clients' businesses, including Corporate & Business Unit Strategy, Technology Strategy & Insights, Enterprise Model Design, Enterprise Cloud Strategy and Business Transformation. Strategy professionals will serve as trusted advisors to our clients, working with them to make clear data-driven choices about where to play and how to win, in order to drive growth and enterprise value. Strategy will help our clients: identify strategies for growth and value creation; develop the appropriate business models, operating models, and capabilities to support their strategic vision; and maximize the ROI on technology investments and leverage technology and Cloud trends to architect future business strategies.

AI & Data: In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The Analytics & Cognitive team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. Analytics & Cognitive will work with our clients to: implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Ab Initio Consultant

The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment.

Education and Experience
Education: B.Tech/M.Tech/MCA/MS/MBA
6-9 years of experience in the design and implementation of database migration and integration solutions for any data warehousing project.

Required Skills
Good knowledge of DBMS concepts, SQL, and PL/SQL.
Good knowledge of the Snowflake system hierarchy.
Good knowledge of Snowflake schemas/tables/views/stages, etc.
Should have strong hands-on experience with Ab Initio development.
Should have strong problem-solving and analytical capabilities.
Should have hands-on experience in the following: data validation, writing custom SQL code, and managing the Snowflake account, users, roles and privileges.
Should have experience in integrating Ab Initio with Snowflake and AWS S3 buckets.
Should have experience in integrating a BI tool like Tableau or Power BI with Snowflake.
Should have experience in fine-tuning and troubleshooting performance issues.
Should be well versed in understanding design documents like HLD, LLD, etc.
Should be well versed with data migration and integration concepts.
Should be a self-starter in solution implementation with inputs from design documents.
Should have participated in different kinds of testing like Unit Testing, System Testing, User Acceptance Testing, etc.
Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc.
Should be able to design, code, test, debug and document software, and enhance existing components to ensure that the software meets business needs.
Should have participated in preparing design documents for any new development or enhancement of the data mart.
Constant communication and follow-up with stakeholders.
Good knowledge of developing UNIX scripts.
Should have hands-on experience with different databases like Teradata and SQL Server.
Should have experience with Autosys.
Experience in all aspects of the Agile SDLC, and end-to-end participation in a project lifecycle.

Preferred Skills
Exposure to Data Modelling concepts is desirable.
Exposure to advanced Snowflake features like data sharing/cloning/export and import is desirable.
Participation in client interactions/meetings is desirable.
Participation in code tuning is desirable.
Exposure to the AWS platform is desirable.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300132
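For illustration only, here is a minimal sketch of the Snowflake side of the integration work this posting describes (staging S3 files, bulk loading, and managing role privileges). All account, bucket, table, and role names are hypothetical, and it assumes the snowflake-connector-python package; the Ab Initio graphs that produce the staged files are built in the Ab Initio GDE rather than in code like this.

```python
# Minimal sketch (illustrative names): load S3-staged files produced by an
# upstream Ab Initio graph into a Snowflake table and grant read access.
# Requires snowflake-connector-python and valid credentials.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # External stage pointing at the S3 bucket the Ab Initio graph writes to.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS abinitio_s3_stage
        URL = 's3://my-bucket/abinitio/out/'
        CREDENTIALS = (AWS_KEY_ID = %s AWS_SECRET_KEY = %s)
        FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    """, ("AWS_KEY", "AWS_SECRET"))

    # Bulk-load the staged files and validate the row count afterwards.
    cur.execute("COPY INTO STAGING.CUSTOMER_DIM FROM @abinitio_s3_stage ON_ERROR = 'ABORT_STATEMENT'")
    cur.execute("SELECT COUNT(*) FROM STAGING.CUSTOMER_DIM")
    print("rows loaded:", cur.fetchone()[0])

    # Simple privilege management: read-only access for a reporting role.
    cur.execute("GRANT USAGE ON SCHEMA STAGING TO ROLE REPORTING_RO")
    cur.execute("GRANT SELECT ON TABLE STAGING.CUSTOMER_DIM TO ROLE REPORTING_RO")
finally:
    conn.close()
```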
Posted 1 month ago
8.0 years
0 Lacs
Panchkula, Haryana, India
On-site
Job Description We are looking for a skilled and experienced Lead/Senior ETL Engineer with 4–8 years of experience to join our data engineering team. In this role, you will be responsible for designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. You’ll also contribute to architectural decisions, lead delivery planning, and provide mentorship to team members. Your hands-on expertise in ETL tools, cloud platforms, and scripting will be key to building efficient, scalable, and reliable data solutions for enterprise-level implementations. Key Skills Strong hands-on experience with ETL tools like SSIS, DataStage, Informatica, or Talend. Deep understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, Fact & Dimension tables. Proficient in working with relational databases: SQL Server, Oracle, Teradata, DB2, or MySQL. Solid scripting/programming skills in Python. Hands-on experience with cloud platforms such as AWS or Azure. Knowledge of middleware architecture and enterprise data integration strategies. Familiarity with reporting/BI tools like Tableau and Power BI. Strong grasp of data modeling principles and performance optimization. Ability to write and review high and low-level design documents. Strong communication skills and experience working with cross-cultural, distributed teams. Roles And Responsibilities Design and develop ETL workflows and data integration strategies. Create and review high and low-level designs adhering to best practices. Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions. Coach and mentor junior engineers to support skill development and performance. Ensure timely delivery, escalate issues proactively, and manage QA and validation processes. Participate in planning, estimations, and recruitment activities. Work on multiple projects simultaneously, ensuring quality and consistency in delivery. Experience in Sales and Marketing data domains. Exposure to reporting and analytics projects. Strong problem-solving abilities with a data-driven mindset. Ability to work independently and collaboratively in a fast-paced environment. Prior experience in global implementations and managing multi-location teams is a plus.
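As an illustration of the warehouse loading patterns mentioned above (star schema, fact and dimension tables, Python scripting), the following is a minimal, hypothetical sketch of a fact-table load with a surrogate-key lookup. The connection strings, schema, and column names are assumptions, and it presumes pandas and SQLAlchemy are available.

```python
# Minimal star-schema load step (illustrative names): resolve dimension
# surrogate keys, reject unmatched rows, then append to a fact table.
import pandas as pd
from sqlalchemy import create_engine

src = create_engine("mssql+pyodbc://user:pwd@sales_dsn")   # hypothetical source DSN
dwh = create_engine("postgresql://user:pwd@dwh/edw")       # hypothetical warehouse

# Extract the day's transactions and the current product dimension.
orders = pd.read_sql(
    "SELECT order_id, product_code, order_date, amount "
    "FROM dbo.orders WHERE order_date = '2024-01-31'", src)
dim_product = pd.read_sql(
    "SELECT product_sk, product_code FROM edw.dim_product WHERE is_current = true", dwh)

# Transform: surrogate-key lookup; unmatched rows go to a reject file for review.
fact = orders.merge(dim_product, on="product_code", how="left")
fact[fact["product_sk"].isna()].to_csv("rejects_orders.csv", index=False)
fact = fact.dropna(subset=["product_sk"])[["order_id", "product_sk", "order_date", "amount"]]

# Load: append the conformed rows to the fact table.
fact.to_sql("fact_sales", dwh, schema="edw", if_exists="append", index=False)
```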
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Teradata BI
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing solutions to enhance business operations and streamline processes in a dynamic environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design, develop, and implement software solutions.
- Troubleshoot and debug applications to ensure optimal performance.
- Develop and maintain technical documentation for reference and reporting purposes.
- Implement best practices for software development and adhere to coding standards.
- Stay updated with emerging technologies and trends in the software development field.

Professional & Technical Skills:
- Must-have skills: Proficiency in Teradata BI.
- Good-to-have skills: Experience with Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of data warehousing concepts and ETL processes.
- Experience in performance tuning and optimization of Teradata BI solutions.
- Knowledge of database design principles and data modeling.
- Familiarity with Agile methodologies and the software development lifecycle.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Teradata BI.
- This position is based at our Chennai office.
- A 15-year full-time education is required.
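For context on the performance-tuning responsibility, below is a small, hypothetical sketch of checking a Teradata query plan and refreshing column statistics from Python. The host, credentials, and table names are illustrative, and it assumes the teradatasql driver (Teradata SQL Driver for Python) is installed.

```python
# Minimal sketch (illustrative object names) of a Teradata tuning check:
# inspect the optimizer plan for a query and refresh column statistics.
import teradatasql

with teradatasql.connect(host="td_host", user="dev_user", password="***") as con:
    with con.cursor() as cur:
        query = """
            SELECT c.region, SUM(s.sales_amt)
            FROM edw.fact_sales s
            JOIN edw.dim_customer c ON s.customer_id = c.customer_id
            GROUP BY c.region
        """
        # EXPLAIN shows the join strategy, row redistribution and spool usage.
        cur.execute("EXPLAIN " + query)
        for row in cur.fetchall():
            print(row[0])

        # Refresh statistics on the join/grouping columns so the optimizer
        # works from current demographics.
        cur.execute("COLLECT STATISTICS ON edw.fact_sales COLUMN (customer_id)")
        cur.execute("COLLECT STATISTICS ON edw.dim_customer COLUMN (region)")
```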
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Our Company Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. What You’ll Do Participate in monthly close activities, including preparing journal entries and conducting analysis of GL activity Review, record and assess Investment and Equity eliminations to the consolidated financial statements Prepare balance sheet reconciliations, investigate discrepancies, and ensure financial data integrity Recommend reclassification and adjustments as needed Participate in quarterly balance sheet review and flux analysis Support corporate departments such as HR, Marketing, Tax, Treasury, etc. Support internal and external audit requests Interpret financial reports to communicate key findings to stakeholders Document systems and processes and ensure compliance with SOX requirements Handle ad hoc reporting and projects as needed Who You’ll Work With The GL Staff Accountant will be part of the Corporate Accounting team within the Teradata Corporate Controller’s group based out of Atlanta, Georgia. The Corporate Accounting team is responsible for the administration and performance of a wide array of general ledger accounting functions. Team members are expected to support a variety of corporate departments and interact with various levels of management. This role will work closely with the corporate and regional accounting teams and report directly to the Corporate Accounting Manager. We’re seeking a detail-oriented and motivated accountant to join our team. The role will play a key part in supporting month-end processes, maintaining accurate financial records, and contributing to the overall success of the accounting operations. Candidates must be comfortable executing and troubleshooting on their own, while hitting tight deadlines. 
What Makes You a Qualified Candidate Advanced problem-solving skills with the ability to interpret data to reach conclusive results Apply critical thinking to analysis to ensure accuracy and GAAP compliance Must possess a basic understanding of GAAP accounting principles Must be a strong team player who collaborates well with others with different personalities and backgrounds Will need to be a quick learner with the desire to improve skill sets Needs to be flexible and willing to adapt to the ever-changing business environment Will need strong time management skills with the ability to effectively prioritize tasks Needs to be able to work well under pressure and to meet tight deadlines Needs to be comfortable working in a fully remote environment Bachelor’s degree in accounting, finance, or related field 5 to 7 years of relevant accounting experience Exceptional problem-solving skills Well-developed Microsoft Office skills, including intermediate Excel proficiency Strong oral and written communication skills What You’ll Bring Experience with Oracle Cloud ERP, specifically general ledger Smart View experience Experience working in accounting for publicly traded companies with international operations Experience preparing and booking journal entries Power BI experience or similar Business Intelligence tools Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Req ID: 324375 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Hyderabad, Telangana (IN-TG), India (IN).

5+ years of experience working with SAP BOBJ servers.
Business Objects suite of products across the entire spectrum of sizing, installation, content management, security configuration, administration, troubleshooting, design and development in on-premises, hybrid and cloud environments, on Windows and Linux/Unix platforms, including development.
Good knowledge of Business Objects XI 3.1 & 4.1, SQL Server, Teradata, DB2.
Expertise in cross-platform migration of SAP BO from Windows to Linux and from version 3.x to 4.x.
Hands-on practice with SAP HANA modeling, SAP BODS, the SAP Dashboard Design tool, and Lumira.
Extensive experience in the IT industry, specializing in developing ITSM operational reports using SAP BO 3.1/4.x Web Intelligence and Crystal Reports, and working with Universe Design (IDT) to design, develop and maintain universes.
Good troubleshooting experience with environment issues; able to channel technical issues into the various components within BO (Web, SDK, etc.).
Good experience tracing issues using troubleshooting tools and fixing/repairing them as per vendor recommendations.
Work closely with SAP Support to segregate issues into the right buckets and coordinate patches into the environment.
Understand the business requirements. Interact with the Architect to provide solutions. Guide the BO team in critical areas and mentor them to grow to the next level.
1. Worked on BO server and client installations and configurations.
2. User access and security setup.
3. Worked on IDT and UDT universes.
4. Server maintenance and backups.
5. Migration of objects between different environments.
Posted 1 month ago
10.0 - 15.0 years
30 - 35 Lacs
Hyderabad
Work from Office
If you are excited about shaping the future of technology and driving significant business impact in financial services, we are looking for people just like you. Join our team and help us develop game-changing, high-quality solutions. As a Senior Lead Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you will be a key member of the Data Product Solutions Architecture Team. Your role involves designing, developing, and implementing analytical data solutions that align with the organization's strategic goals. You will leverage your expertise in data architecture, data modeling, data migrations and data integration, collaborating with cross-functional teams to achieve target state architecture goals.

Job responsibilities
Represent the Data Product Solutions Architecture team in various forums, advising on Data Product Solutions.
Lead the design and maintenance of scalable data solutions, including data lakes and warehouses.
Collaborate with cross-functional teams to ensure data product solutions support business needs and enable data-driven decision-making.
Evaluate and select data technologies, driving the adoption of emerging technologies.
Develop architectural models using Archimate, the C4 Model, etc., and other artifacts to support data initiatives.
Serve as a subject matter expert in specific areas.
Contribute to the data engineering community and advocate for firmwide data practices.
Engage in hands-on coding and design to implement production solutions.
Optimize system performance by resolving inefficiencies.
Influence product design and technical operations.
Develop multi-year roadmaps aligned with business and data technology strategies.
Design reusable data frameworks using new technologies.

Required qualifications, capabilities, and skills
Bachelor's or Master's degree in Computer Science or a related field with 10+ years of experience.
5+ years as a Data Product Solution Architect or in a similar role leading technologists to manage, anticipate and solve complex technical items within your domain of expertise.
Hands-on experience in system design, application development, and operational stability.
Expertise in architecture disciplines and programming languages.
Deep knowledge of data architecture, modeling, integration, cloud data services, data domain-driven design, best practices, and industry trends in data engineering.
Practical experience with AWS, big data technologies, and data engineering disciplines.
Advanced experience in one or more data engineering disciplines, e.g. streaming, ETL/ELT, event processing.
Proficiency in SQL and data warehousing solutions using Teradata or similar cloud-native relational databases, e.g. Snowflake, Athena, Postgres.
Strong problem-solving, communication, and interpersonal skills.
Ability to evaluate and recommend technologies for future state architecture.

Preferred qualifications, capabilities, and skills
Financial services experience, especially in card and banking.
Experience with modern data processing technologies such as Kafka streaming, DBT, Spark, Python, Java, Airflow, etc., using data mesh and data lake approaches.
Business architecture knowledge and experience with architecture assessment frameworks.
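By way of illustration of the streaming and data lake technologies named in the preferred qualifications, the sketch below shows a generic Kafka-to-data-lake pipeline built with Spark Structured Streaming. The broker, topic, schema, and storage paths are hypothetical and it assumes a Spark runtime with the Kafka connector available; it is a common shape for such a pipeline, not this organization's implementation.

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> partitioned
# data-lake storage. Topic, schema and paths are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("card-txn-stream").getOrCreate()

txn_schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("txn_ts", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
       .option("subscribe", "card-transactions")           # hypothetical topic
       .load())

# Parse the JSON payload and derive a partition column.
txns = (raw.select(F.from_json(F.col("value").cast("string"), txn_schema).alias("t"))
        .select("t.*")
        .withColumn("txn_date", F.to_date("txn_ts")))

query = (txns.writeStream
         .format("parquet")
         .option("path", "s3a://lake/bronze/card_transactions/")          # illustrative path
         .option("checkpointLocation", "s3a://lake/_chk/card_transactions/")
         .partitionBy("txn_date")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```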
Posted 1 month ago
3.0 - 6.0 years
5 - 9 Lacs
Chennai
Work from Office
We are seeking an experienced Azure Data Engineer to join our team in a hybrid Developer/Support capacity. This role focuses on enhancing and supporting existing Data & Analytics solutions by leveraging Azure Data Engineering technologies. The engineer will work on developing, maintaining, and deploying IT products and solutions that serve various business users, with a strong emphasis on performance, scalability, and reliability. Must-Have Skills: Azure Databricks PySpark Azure Synapse Analytics Key Responsibilities: Incident classification and prioritization Log analysis and trend identification Coordination with Subject Matter Experts (SMEs) Escalation of unresolved or complex issues Root cause analysis and permanent resolution implementation Stakeholder communication and status updates Resolution of complex and major incidents Code reviews (Per week 2 per individual) to ensure adherence to standards and optimize performance Bug fixing of recurring or critical issues identified during operations Gold layer tasks, including enhancements and performance tuning. Design, develop, and support data pipelines and solutions using Azure data engineering services. Implement data flow and ETL techniques leveraging Azure Data Factory, Databricks, and Synapse. Cleanse, transform, and enrich datasets using Databricks notebooks and PySpark. Orchestrate and automate workflows across services and systems. Collaborate with business and technical teams to deliver robust and scalable data solutions. Work in a support role to resolve incidents, handle change/service requests, and monitor performance. Contribute to CI/CD pipeline implementation using Azure DevOps. Technical Requirements: 3 to 6 years of experience in IT and Azure data engineering technologies. Strong experience in Azure Databricks, Azure Synapse, and ADLS Gen2. Proficient in Python, PySpark, and SQL. Experience with file formats such as JSON and Parquet. Working knowledge of database systems, with a preference for Teradata and Snowflake. Hands-on experience with Azure DevOps and CI/CD pipeline deployments. Understanding of Data Warehousing concepts and data modeling best practices. Familiarity with SNOW (ServiceNow) for incident and change management. Non-Technical Requirements: Ability to work independently and collaboratively in virtual teams across geographies. Strong analytical and problem-solving skills. Experience in Agile development practices, including estimation, testing, and deployment. Effective task and time management with the ability to prioritize under pressure. Clear communication and documentation skills for project updates and technical processes. Technologies: Azure Data Factory Azure Databricks Azure Synapse Analytics PySpark / SQL Azure Data Lake Storage (ADLS), Blob Storage Azure DevOps (CI/CD pipelines) Nice-to-Have: Experience with Business Intelligence tools, preferably Power BI DP-203 certification (Azure Data Engineer Associate)
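As a rough illustration of the Databricks/PySpark cleanse-and-transform work described above, here is a minimal silver-layer sketch: read raw Parquet from ADLS Gen2, deduplicate, standardise, and write a Delta table. The storage paths and column names are assumptions, and it presumes a Databricks runtime where Delta Lake and ADLS access are already configured.

```python
# Hypothetical silver-layer cleanse step: keep the latest record per key,
# tidy columns, and persist as Delta. Paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/sales/orders/"        # hypothetical
silver_path = "abfss://curated@mydatalake.dfs.core.windows.net/silver/orders/"

orders = spark.read.parquet(raw_path)

# Keep only the latest record per order_id, trim strings and fix obvious nulls.
latest = Window.partitionBy("order_id").orderBy(F.col("ingest_ts").desc())
clean = (orders
         .withColumn("rn", F.row_number().over(latest))
         .filter("rn = 1")
         .drop("rn")
         .withColumn("customer_name", F.trim("customer_name"))
         .withColumn("amount", F.coalesce("amount", F.lit(0.0))))

(clean.write
 .format("delta")
 .mode("overwrite")
 .option("overwriteSchema", "true")
 .save(silver_path))
```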
Posted 1 month ago
3.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
The primary purpose of this role is to perform mathematical and statistical analysis or model building as appropriate. This includes following analytical best practices, analyzing and reporting accurate results, and identifying meaningful insights that directly support decision making. This role provides assistance in supporting one functional area of the business in partnership with other team members. At times, this role may work directly with the business function, but the majority of time is spent working with internal team members to identify and understand business needs. With a focus specifically on Digital Analytics Implementation, this role assists with providing data capture capabilities to support digital analytics initiatives. This role helps translate business needs into effective digital analytics tagging specifications that provide metrics for analytic solutions across various digital channels. These tagging specifications ensure that data can be translated into tangible results for Lowe's and Lowe's Digital. To be successful, the individual in this role must have technical expertise in programs such as SQL or Python, know how to query data, and have the ability to use operational and customer analytical data, merge data sets from different systems, and summarize and visualize findings using visualization tools (e.g., MicroStrategy VI) to provide actionable insights to business partners.

Qualifications
Bachelor's Degree in Business, Economics, Engineering, Statistics, Data or Information Sciences, or a related field (or equivalent work experience in a related field) AND 1 year of related experience, OR a Master's Degree in Business, Economics, Engineering, Statistics, Data or Information Sciences, or a related field.
ALSO: Experience using basic analytical tools such as R, Python, SQL, SAS, Adobe, Alteryx, Knime, Aster.
Experience using visualization tools such as MicroStrategy VI, Power BI, Tableau.

Preferred Qualifications
Master's Degree in Business, Engineering, Statistics, Economics or a related area.
Experience with business intelligence and reporting tools (e.g., MicroStrategy, Business Objects, Cognos, Adobe, TM1, Alteryx, Knime, SSIS, SQL Server) and enterprise-level databases (Hadoop, GCP, Azure, Oracle, Teradata, DB2).
Experience working with big, unstructured data in a retail environment.
Experience with analytical tools like Python, Alteryx, Knime, SAS, R, etc.
Experience with visualization tools like MicroStrategy VI, Power BI, SAS-VA, Tableau, D3, R-Shiny.
Programming experience using tools such as R, Python.
Data Science experience using tools such as ML, text mining.
Knowledge of SQL.
Project management experience.
Experience in home improvement retail.

Digital Analytics Implementation
1 year of experience in digital analytics implementation using enterprise-grade tools such as Adobe Dynamic Tag Management (DTM), Google Tag Manager (GTM), Tealium, Ensighten, etc. (specific to the Digital Analytics Implementation role).
Posted 1 month ago
2.0 - 7.0 years
2 - 6 Lacs
Bengaluru
Work from Office
The principal purpose of the Analyst is to work with the Sr. Analyst/Lead Analyst in delivering impactful, data-driven analytics support to the Merchant organization. This position is responsible for following analytical best practices, accurately reporting and analyzing results, and identifying insights for decision making. The Analyst is responsible for developing an unbiased, holistic view of the key drivers of business performance. This is accomplished by utilizing advanced tools and methods to leverage customer and transaction data. To accomplish this, the Analyst must have solid knowledge of retail analytics. This individual will work with the Sr. Analyst and/or the Lead Analysts to ensure that insights from USHI functional areas and business units (BUs) are understood and incorporated into work products. Additionally, this individual must be able to work effectively within a matrixed organization and demonstrate self-leadership skills.

Core Responsibilities:
Understands retail analytics and works effectively under the guidance of a Sr. Analyst and/or Lead Analyst to deliver impactful, data-driven analytics insights/recommendations.
Builds and/or validates recommendations based on sound methodology, data gathering, and data analysis; uses data-driven conclusions and decisions to provide solutions.
Communicates observations and/or insights to the Sr. Analyst and/or Lead Analysts to help prepare analyses by leveraging multiple data sources.
Demonstrates resourcefulness and resilience in the face of change, obstacles, and adversity. This includes adapting to competing demands and shifting priorities, improving adaptability, pursuing new skills and knowledge, and regularly seeking feedback from others.

Primary Skills (must have)
2+ years of experience in Data Analytics.
Wrangling data using complex SQL and data extraction processes with tools like SQL Assistant, DBeaver, etc., across databases in Teradata and Hadoop.
Proficiency in data transformations and creating reports using MS Excel.
Working with large data sets and generating actionable insights.
Demonstrated experience working cross-functionally and with global partners.
Experience with other Microsoft Office tools (e.g., Word, PowerPoint).

Secondary Skills (desired)
1 year of experience in retail category analytics working directly with merchants.
Experience with business intelligence and reporting tools (e.g., Power BI, MicroStrategy, Tableau, etc.).
Experience working with big data and data transformation tools such as Python, SAS, R, etc.
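To make the data-wrangling expectation above concrete, the following hypothetical sketch pulls category sales from Teradata, computes year-over-year growth in pandas, and writes an Excel report for merchants. The table, column, and credential names are illustrative, and it assumes pandas, openpyxl, and the teradatasql driver are installed.

```python
# Illustrative merchant-analytics pull: SQL aggregation in Teradata, YoY
# calculation in pandas, Excel hand-off. All names are hypothetical.
import pandas as pd
import teradatasql

SQL = """
SELECT category_id, EXTRACT(YEAR FROM sale_date) AS sale_year, SUM(sales_amt) AS sales
FROM edw.store_sales
WHERE sale_date BETWEEN DATE '2023-01-01' AND DATE '2024-12-31'
GROUP BY 1, 2
"""

with teradatasql.connect(host="td_host", user="analyst", password="***") as con:
    with con.cursor() as cur:
        cur.execute(SQL)
        rows = cur.fetchall()
        cols = [d[0] for d in cur.description]

sales = pd.DataFrame(rows, columns=cols)
sales["sale_year"] = sales["sale_year"].astype(int)
sales["sales"] = sales["sales"].astype(float)

# Pivot to one row per category with a year-over-year growth column.
pivot = sales.pivot(index="category_id", columns="sale_year", values="sales")
pivot["yoy_growth_pct"] = (pivot[2024] / pivot[2023] - 1) * 100

pivot.sort_values("yoy_growth_pct", ascending=False).to_excel("category_yoy.xlsx")
```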
Posted 1 month ago
14.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights and optimizing processes for efficiency and client satisfaction. The role requires a deep understanding of IT services, operational excellence, and client-centric solutions. Job Requirements And Preferences: Minimum Degree Required: Bachelor’s degree in information technology, Data Science, Computer Science, Statistics, or a related field (Master’s degree preferred) Minimum Years of Experience: 14 year(s) with at least 3 years in a managerial or leadership role. Proven experience in managing data analytics services for external clients, preferably within a managed services or consulting environment Technical Skills: Experience and knowhow of working with a combination/subset tools and technologies listed below Proficiency in data analytics tools (e.g., Power BI, Tableau, QlikView), Data Integration tools (ETL, Informatica, Talend, Snowflake etc.) and programming languages (e.g., Python, R, SAS, SQL). Strong understanding of Data & Analytics services cloud platforms (e.g., AWS, Azure, GCP) like AWS Glue, EMR, ADF, Redshift, Synapse, BigQuery etc and big data technologies (e.g., Hadoop, Spark). Familiarity with traditional Data warehousing tools like Teradata, Netezza etc Familiarity with machine learning, AI, and automation in data analytics. Certification in data-related disciplines preferred Leadership: Demonstrated ability to lead teams, manage complex projects, and deliver results. Communication: Excellent verbal and written communication skills, with the ability to present complex information to non-technical stakeholders Roles & Responsibilities: Demonstrates intimate abilities and/or a proven record of success as a team leader, emphasizing the following: Client Relationship Management: Serve as the focal point for level client interactions, maintaining strong relationships. Manage client escalations and ensure timely resolution of issues. Face of the team for strategic client discussions, Governance and regular cadence with Client Service Delivery Management: Responsibly Lead end-to-end delivery of managed data analytics services to clients, ensuring projects meet business requirements, timelines, and quality standards Deliver Minor Enhancements and Bug Fixes aligned to client’s service delivery model Good Experience setting up Incident Management, Problem Management processes for the engagement Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to deliver end-to-end solutions Monitor, manage & report service-level agreements (SLAs) and key performance indicators (KPIs). Solid financial acumen with experience in budget management. Problem-solving and decision-making skills, with the ability to think strategically Operational Excellence & practice growth: Implement and oversee standardized processes, workflows, and best practices to ensure efficient operations. Utilize tools and systems for service monitoring, reporting, and automation to improve service delivery. Drive innovation and automation in data integration, processing, analysis, and reporting workflows Keep up to date with industry trends, emerging technologies, and regulatory requirements impacting managed services. 
Risk and Compliance: Ensure data security, privacy, and compliance with relevant standards and regulations Ensure all managed services are delivered in compliance with relevant regulatory requirements and industry standards. Proactively identify and mitigate operational risks that could affect service delivery. Team Leadership & Development: Lead and mentor a team of service managers and technical professionals to ensure high performance and continuous development. Foster a culture of collaboration, accountability, and excellence within the team. Ensure the team is trained on the latest industry best practices, tools, and methodologies. Capacity Management, experience with practice development, strong understanding of agile practices, cloud platforms, and infrastructure management Pre-Sales Experience: Collaborate with sales teams to identify opportunities for growth and expansion of services. Experience in solutioning of responses and operating model including Estimation frameworks, content contribution, solution architecture in responding to RFPs
Posted 1 month ago
0.0 - 4.0 years
9 - 13 Lacs
Hyderabad
Work from Office
As a Software Engineer II at JPMorgan Chase within Corporate Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities
Executes standard software solutions, design, development, and technical troubleshooting.
Builds pipelines in Spark and tunes Spark queries.
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance.
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications.
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity.
Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development.
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems.
Stays up to date with the latest advancements in GenAI and LLM technologies and incorporates them into our data engineering practices.

Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 2+ years of applied experience.
Hands-on practical experience in system design, application development, testing, and operational stability.
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
Background with Machine Learning frameworks and Big Data technologies such as Hadoop.
Strong experience in programming languages such as Java or Python.
Python Machine Learning library and ecosystem experience (Pandas, NumPy, etc.).
Experience with cloud technologies such as AWS or Azure.
Experience working with databases such as Cassandra, MongoDB or Teradata.
Experience across the whole Software Development Life Cycle.
Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security.
Experience with Generative AI and Large Language Models, and experience integrating these technologies into data workflows.

Preferred qualifications, capabilities, and skills
Familiarity with modern front-end technologies.
Exposure to cloud technologies.
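As an example of the "building pipelines in Spark, tuning Spark queries" responsibility, here is a short, generic sketch that replaces a shuffle-heavy join of a large fact table against a small reference table with a broadcast join and checks the physical plan. The dataset paths and column names are hypothetical.

```python
# Hypothetical Spark tuning example: broadcast the small lookup side of a join
# to avoid shuffling the large fact table, then verify the plan.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-enrichment").getOrCreate()

txns = spark.read.parquet("s3a://corp-data/transactions/")     # large fact table (illustrative)
merchants = spark.read.parquet("s3a://corp-data/merchants/")   # small lookup table (illustrative)

# Broadcasting the small side avoids a full shuffle of the transactions data.
enriched = txns.join(F.broadcast(merchants), "merchant_id", "left")

daily = (enriched
         .groupBy("merchant_category", F.to_date("txn_ts").alias("txn_date"))
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("account_id").alias("active_accounts")))

daily.explain()  # confirm BroadcastHashJoin appears in the physical plan
daily.write.mode("overwrite").partitionBy("txn_date").parquet("s3a://corp-data/marts/daily_merchant/")
```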
Posted 1 month ago
7.0 - 10.0 years
22 - 30 Lacs
Hubli, Mangaluru, Mysuru
Work from Office
Responsible for promoting the use of industry and Company technology standards. Monitors emerging technologies/technology practices for potential use within the Company. Designs and develops updated infrastructure in support of one or more business processes. Helps to ensure a balance between tactical and strategic technology solutions. Considers business problems "end-to-end": including people, process and technology, both within and outside the enterprise, as part of any design solution. Mentors, reviews codes and verifies that the object oriented design best practices and that coding and architectural guidelines are adhered to. Identifies and drives issues through closure. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise. Job Description Core Responsibilities Ensures programs are envisioned, designed, developed and implemented across the enterprise to meet business needs. Interfaces with the enterprise architecture team and other functional areas to ensure that most efficient solution is designed to meet business needs. Ensures solutions are well engineered, operable, maintainable and delivered on schedule. Tracks and documents requirements for enterprise development projects and enhancements. Monitors current and future trends, technology and information that will positively affect organizational projects; applies and integrates emerging technological trends to new and existing systems architecture. Presents solutions to senior architects. Plans and designs new or upgraded systems. Interacts with departments to implement improvements in process. Develops solution architecture (both tactical and strategic) to fully manage/support enterprise needs, services, systems and technology management. Mentors team members in relevant technologies and implementation architecture. Develops, documents and ensures compliance with best practices including but not limited to the following coding standards, object oriented designs, platform and framework specific design concerns and human interface guidelines. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Strong experience in end to end data management - data acquisition, data model, data storage, data access - with various tools and technologies Solid job experience with Data Modeling tools such as PowerDesigner, ERwin, or Visio Strong experience in data management and architecture in RDBMS. SQL based tools - with hands on experience Experience with large implementations of both relational and dimensional reporting systems Teradata implementation and design experience (especially with an LDM) is critical Strong SQL skills for data validation, analysis, and troubleshooting Strong experience in real time data acquisition in a data warehouse environment Participate in group meetings to present and defend design decisions Good to have experience in AWS Cloud Availability, Capacity, Disaster Recovery planning, implementing security policies, managing PII data Experience in Big Data/Cloud Technologies like AWS S3, MinIO, Hive, Spark, Redshift, Athena, Presto Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. 
Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
Win as a team - make big things happen by working together and being open to new ideas.
Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making callbacks and helping us elevate opportunities to do better for our customers.
Drive results and growth.
Respect and promote inclusion & diversity.
Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Please visit the benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Certifications (if applicable)
Relevant Work Experience: 7-10 Years
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Hyderābād
On-site
Skills:
3-5 years of Data Warehouse/Business Intelligence work experience, including work with Talend Open Studio.
Extensive experience with the Talend Real-Time Big Data Platform in the areas of design, development and testing, with a focus on Talend Data Integration and Talend Big Data Real-Time, including big data streaming (Spark) jobs with different databases.
Experience working with databases like Greenplum, HAWQ, Oracle, Teradata, MS SQL Server, Sybase, Cassandra, MongoDB, flat files, APIs, and different Hadoop concepts in Big Data (ecosystems like Hive, Pig, Sqoop, and MapReduce).
Working knowledge of Java is preferred.
Advanced knowledge of ETL, including the ability to read and write efficient, robust code, follow or implement best practices and coding standards, design and implement common ETL strategies (CDC, SCD, etc.), and create reusable, maintainable jobs.
Solid background in database systems (such as Oracle, SQL Server, Redshift and Salesforce) along with strong knowledge of PL/SQL and SQL.
Hands-on knowledge of Unix commands and shell scripting.
Good knowledge of SQL, including the ability to write stored procedures, triggers, functions, etc.
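Since the posting calls out common ETL strategies such as SCD, below is a compact, hypothetical illustration of Slowly Changing Dimension Type 2 logic in pandas; in a Talend job the same pattern would typically be built with components or SQL rather than Python, and all table and column names are assumptions.

```python
# Hypothetical SCD Type 2 sketch: expire changed dimension rows and insert
# new current versions for changed and brand-new business keys.
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")

dim = pd.DataFrame({                      # current dimension (illustrative)
    "customer_id": [1, 2],
    "city": ["Pune", "Chennai"],
    "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
    "valid_to": [HIGH_DATE, HIGH_DATE],
    "is_current": [True, True],
})
incoming = pd.DataFrame({"customer_id": [2, 3], "city": ["Bengaluru", "Hyderabad"]})
load_date = pd.Timestamp("2024-06-01")

current = dim[dim["is_current"]]
merged = incoming.merge(current, on="customer_id", how="left", suffixes=("", "_old"))

changed = merged[merged["city_old"].notna() & (merged["city"] != merged["city_old"])]
new = merged[merged["city_old"].isna()]

# Expire the changed rows, then insert new versions for changed and new keys.
dim.loc[dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"],
        ["valid_to", "is_current"]] = [load_date, False]

inserts = pd.concat([changed, new])[["customer_id", "city"]].assign(
    valid_from=load_date, valid_to=HIGH_DATE, is_current=True)
dim = pd.concat([dim, inserts], ignore_index=True)
print(dim)
```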
Posted 1 month ago
0 years
0 Lacs
India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires) China (Dalian) India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum) Philippines (Manila) Poland (Warsaw and Wroclaw) UK (Manchester, Liverpool) Careers in EY Global Delivery Services Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. We are looking for highly motivated, articulate individuals who have the skills to the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/ services lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end to end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/ Java/ SharePoint/ Power Platform), Emerging Technologies like Block Chain, IoT, AR\VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods utilizing our global teams to deliver end to end solutions at best unit cost proposition. Testing Services: We are the yardstick of quality software product. We break something to make the product stronger and successful. We provide entire gamut of testing services including Busines / User acceptance testing. Hence this is a team with all round skills such as functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. 
Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advance Analytics (AA) and Artificial Intelligence (AI) Oracle: We provide one-stop solution for end-to-end project implementation enabled by Oracle and IBM Products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformation. We develop solutions using various languages such as SQL or PL/ SQL, Java, Java Script, Python, IBM Maximo and other Oracle Utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas, Technology implementation support, Enterprise and Industry application implementation, Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business / digital transformation like Deposit system replacements, lending / leasing modernization, Cloud–native architecture (Containerization) etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency, effectiveness and manage the overall impact on bottom line profitability by leveraging the technology, data and digital teams. We do many operational efficiency programs and Technology Enabled Transformation to re-platform their front and Back offices with emerging technologies like AI, ML, Blockchain etc. Insurance Transformation: The current changing Macroeconomic trends continue to challenge Insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles, Blockchain etc, we help companies through these challenges and create innovative strategies to transform their business through technology enabled transformation programs. We provide end to end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative – from day one. Technology Risk: A practice that is a unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. 
An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. Behavioral Competencies: Adaptive to team and fosters collaborative approach Innovative approach to the project, when required Shows passion and curiosity, desire to learn and can think digital Agile mindset and ability to multi-task Must have an eye for detail Skills needed: Should have understanding and/or experience of software development best practices and software development life cycle Understanding of one/more programming languages such as Java/ .Net/ Python, data analytics or databases such as SQL/ Oracle/ Teradata etc. Internship in a relevant technology domain will be an added advantage Qualification: BE - B. Tech / (IT/ Computer Science/ Circuit branches) Should have secured 60% and above No active Backlogs EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
2.0 years
6 - 8 Lacs
Hyderābād
On-site
Our company: At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.
What you will do: In this role, the primary responsibility is to perform testing for Cloud Native Database features. This includes understanding feature specifications early in the development cycle to create test specifications, test cases, and test plans under the guidance of feature test architects. This is a key testing position, accountable for performing end-to-end testing of complex database features, including functional, regression and performance testing, ensuring that the features are delivered with high quality within committed timelines.
Who you will work with: You will join a team of dedicated professionals who strive to deliver high quality features with passion, excellence, collaboration, and innovation. You will have the opportunity to express your creativity and originality in the projects.
What Makes You a Qualified Candidate: BE/B.Tech/M.Tech degree in Computer Science/Information Technology or related field with 2+ years of programming experience and/or software testing experience. Strong scripting knowledge in Shell/Python. Experience in working with Relational Databases (RDBMS) and SQL. Experience with cloud-based development and systems, having worked on projects involving AWS, Azure and Google Cloud platforms.
What You will bring: Primarily responsible for the end-to-end testing of cloud database features. Ability to understand features and functionalities from the design documents. Ability to write feature test specifications, develop and automate test cases. Testing bug fixes and providing assistance in debugging and isolating issues. Ability to learn and adapt to new technologies, with attention to detail. Working experience in Agile Software Development with exposure to test-driven development. Proven record in completing/delivering features. Strong verbal and written communication skills. Strong interpersonal skills for effective teamwork. #LI-SK3
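As a loose illustration of the automated, SQL-level functional testing this role describes, the sketch below uses pytest with an in-memory SQLite database standing in for the cloud database under test; the table, data and assertions are hypothetical.

```python
import sqlite3
import pytest

@pytest.fixture
def conn():
    # In-memory database standing in for the system under test.
    connection = sqlite3.connect(":memory:")
    connection.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL NOT NULL)")
    connection.executemany(
        "INSERT INTO orders (order_id, amount) VALUES (?, ?)",
        [(1, 120.0), (2, 80.5), (3, 99.5)],
    )
    yield connection
    connection.close()

def test_aggregate_matches_expected_total(conn):
    # Functional check: SUM over the column matches the known fixture total.
    (total,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
    assert total == pytest.approx(300.0)

def test_primary_key_rejects_duplicates(conn):
    # Negative test: the feature under test must enforce uniqueness.
    with pytest.raises(sqlite3.IntegrityError):
        conn.execute("INSERT INTO orders (order_id, amount) VALUES (1, 10.0)")
```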
Posted 1 month ago
0 years
0 Lacs
Hyderābād
Remote
Job Summary: We are seeking a highly analytical and experienced Data Modeler and Analyst to play a pivotal role in designing, developing, and maintaining our enterprise data models and providing insightful data analysis. The ideal candidate will bridge the gap between business needs and technical solutions, ensuring our data architecture is robust, scalable, and accurately reflects our organizational data requirements. This role requires a strong understanding of data modeling principles, excellent SQL skills, and the ability to translate complex data into actionable insights for various stakeholders.
Key Responsibilities:
Data Modeling: Design, develop, and maintain conceptual, logical, and physical data models for various data initiatives (e.g., data warehouses, data marts, operational data stores, transactional systems). Work closely with business stakeholders, subject matter experts, and technical teams to gather and understand data requirements, translating them into accurate and efficient data models. Implement best practices for data modeling, including normalization, denormalization, dimensional modeling (star schema, snowflake schema), and data vault methodologies. Create and maintain data dictionaries, metadata repositories, and data lineage documentation. Ensure data model integrity, consistency, and compliance with data governance standards. Perform data profiling and data quality assessments to understand source system data and identify modeling opportunities and challenges.
Data Analysis: Perform in-depth data analysis to identify trends, patterns, anomalies, and insights that can drive business decisions. Write complex SQL queries to extract, transform, and analyze data from various relational and non-relational databases. Develop and present clear, concise, and compelling reports, dashboards, and visualizations to communicate findings to technical and non-technical audiences. Collaborate with business units to define KPIs, metrics, and reporting requirements. Support ad-hoc data analysis requests and provide data-driven recommendations.
Collaboration & Communication: Act as a liaison between business users and technical development teams (ETL developers, BI developers, DBAs). Participate in data governance initiatives, ensuring data quality, security, and privacy. Contribute to the continuous improvement of data architecture and data management practices. Mentor junior data professionals and share knowledge within the team.
Tools: Informatica, Teradata, Axiom, SQL, Databricks
Job Types: Full-time, Permanent, Fresher, Freelance
Contract length: 18 months
Pay: ₹300,000.00 - ₹28,000,000.00 per year
Benefits: Paid time off, Work from home
Schedule: Day shift
Work Location: In person
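As a small sketch of the data-profiling step listed above, a quick pandas pass over a source extract might look like the following; the column names, sample values and duplicate-key check are hypothetical.

```python
import pandas as pd

# Hypothetical extract from a source system.
customers = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "country": ["IN", "IN", "US", None],
})

def profile(df: pd.DataFrame, key: str) -> pd.DataFrame:
    """Basic data-quality profile: null rate and distinct count per column, plus duplicate-key count."""
    summary = pd.DataFrame({
        "null_rate": df.isna().mean(),
        "distinct_values": df.nunique(dropna=True),
    })
    print(f"duplicate '{key}' values: {df[key].duplicated().sum()}")
    return summary

print(profile(customers, key="customer_id"))
```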
Posted 1 month ago
0 years
4 - 8 Lacs
Chennai
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires) China (Dalian) India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum) Philippines (Manila) Poland (Warsaw and Wroclaw) UK (Manchester, Liverpool) Careers in EY Global Delivery Services Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. We are looking for highly motivated, articulate individuals who have the skills to the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/ services lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end to end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/ Java/ SharePoint/ Power Platform), Emerging Technologies like Block Chain, IoT, AR\VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods utilizing our global teams to deliver end to end solutions at best unit cost proposition. Testing Services: We are the yardstick of quality software product. We break something to make the product stronger and successful. We provide entire gamut of testing services including Busines / User acceptance testing. Hence this is a team with all round skills such as functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. 
Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advance Analytics (AA) and Artificial Intelligence (AI) Oracle: We provide one-stop solution for end-to-end project implementation enabled by Oracle and IBM Products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformation. We develop solutions using various languages such as SQL or PL/ SQL, Java, Java Script, Python, IBM Maximo and other Oracle Utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas, Technology implementation support, Enterprise and Industry application implementation, Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business / digital transformation like Deposit system replacements, lending / leasing modernization, Cloud–native architecture (Containerization) etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency, effectiveness and manage the overall impact on bottom line profitability by leveraging the technology, data and digital teams. We do many operational efficiency programs and Technology Enabled Transformation to re-platform their front and Back offices with emerging technologies like AI, ML, Blockchain etc. Insurance Transformation: The current changing Macroeconomic trends continue to challenge Insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles, Blockchain etc, we help companies through these challenges and create innovative strategies to transform their business through technology enabled transformation programs. We provide end to end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative – from day one. Technology Risk: A practice that is a unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. 
An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. Behavioral Competencies: Adaptive to team and fosters collaborative approach Innovative approach to the project, when required Shows passion and curiosity, desire to learn and can think digital Agile mindset and ability to multi-task Must have an eye for detail Skills needed: Should have understanding and/or experience of software development best practices and software development life cycle Understanding of one/more programming languages such as Java/ .Net/ Python, data analytics or databases such as SQL/ Oracle/ Teradata etc. Internship in a relevant technology domain will be an added advantage Qualification : BE - B. Tech / (IT/ Computer Science/ Circuit branches) Should have secured 60% and above No active Backlogs EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Data n’ Analytics – Data Strategy - Manager, Strategy and Transactions EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for Manager - Data Strategy. The main objective of the role is to develop and articulate a clear and concise data strategy aligned with the overall business strategy. Communicate the data strategy effectively to stakeholders across the organization, ensuring buy-in and alignment. Establish and maintain data governance policies and procedures to ensure data quality, security, and compliance. Oversee data management activities, including data acquisition, integration, transformation, and storage. Develop and implement data quality frameworks and processes.The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for the clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects and is also on-shore facing. Discipline Data Strategy Key Skills Strong understanding of data models (relational, dimensional), data warehousing concepts, and cloud-based data architectures (AWS, Azure, GCP). Proficiency in data analysis techniques (e.g., SQL, Python, R), statistical modeling, and data visualization tools. Familiarity with big data technologies such as Hadoop, Spark, and NoSQL databases. Client Handling and Communication, Problem Solving, Systems thinking, Passion of technology, Adaptability, Agility, Analytical thinking, Collaboration Skills And Attributes For Success 10-12 years of total experience with 8+ years in Data Strategy and Architecture field Solid hands-on 6+ years of professional experience with designing and architecting of data warehouses/ data lakes on client engagements and helping create enhancements to a data warehouse Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP) 5+ years’ experience in Azure database offerings [ Relational, NoSQL, Datawarehouse ] 5+ years experience in various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks Minimum of 8 years of hands-on database design, modelling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes etc.) Strong creative instincts related to data analysis and visualization. Aggressive curiosity to learn the business methodology, data model and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends. 
Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management Willingness to mentor team members Solid analytical, technical and problem-solving skills Excellent written and verbal communication skills Strong project and people management skills with experience in serving global clients To qualify for the role, you must have Master’s Degree in Computer Science, Business Administration or equivalent work experience. Fact driven and analytically minded with excellent attention to details Hands-on experience with data engineering tasks such as building analytical data records and experience manipulating and analysing large volumes of data Relevant work experience of minimum 12 to 14 years in a big 4 or technology/ consulting set up Help incubate new finance analytic products by executing Pilot, Proof of Concept projects to establish capabilities and credibility with users and clients. This may entail working either as an independent SME or as part of a larger team Ideally, you’ll also have Ability to think strategically/end-to-end with result-oriented mindset Ability to build rapport within the firm and win the trust of the clients Willingness to travel extensively and to work on client sites / practice office locations Strong experience in SQL Server and MS Excel plus at least one other SQL dialect, e.g. MS Access, PostgreSQL, Oracle PL/SQL or MySQL. Strong in Data Structures & Algorithms Experience of interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc. Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc. What We Look For A Team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of market-leading, multi-disciplinary team of 1400 + professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
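Purely to illustrate the "building analytical data records" kind of task mentioned in this listing, here is a minimal, hedged PySpark sketch; the dataset, column names and metrics are hypothetical, and on Databricks or Synapse the Spark session would already be provided.

```python
from pyspark.sql import SparkSession, functions as F

# Local Spark session; on Databricks/Synapse a session is supplied for you.
spark = SparkSession.builder.appName("analytical-data-record").getOrCreate()

# Hypothetical transaction extract already landed in the data lake.
transactions = spark.createDataFrame(
    [("C1", "2024-01-05", 1200.0), ("C1", "2024-02-11", 300.0), ("C2", "2024-01-20", 450.0)],
    ["customer_id", "txn_date", "amount"],
)

# One analytical data record per customer: total spend, transaction count, last activity.
adr = (
    transactions
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_spend"),
        F.count("*").alias("txn_count"),
        F.max("txn_date").alias("last_txn_date"),
    )
)
adr.show()
```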
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise. What You'll Do: Shape the Way the World Understands Data Are you ready to be at the forefront of a revolution? At Teradata, we're not just managing data; we're unlocking its hidden potential through the power of Artificial Intelligence and Machine Learning. As a key member of our innovative AI/ML team, you'll be architecting, building, and deploying cutting-edge software solutions that will literally transform languages within the Teradata Vantage platform – a cornerstone of our strategic vision and a powerhouse in the analytics world. Dive deep into the performance DNA of AI/ML applications. You'll be the detective, identifying and crushing bottlenecks to ensure our solutions not only scale massively but also deliver lightning-fast results. Your mission? To champion quality at every stage, tackling the unique and exhilarating challenges presented by AI/ML in the cloud. Become an integral part of a brilliant, tightly-knit team where collaboration isn't just a buzzword – it's how we create world-class, enterprise-grade software that pushes boundaries. You'll be a knowledge champion, diving into the intricacies of our domain, crafting compelling documentation, and sharing your expertise to inspire other teams. Here, you'll be a technical visionary, defining the standards and architecting the solutions that will shape our future. You'll contribute to the blueprint, refining high-level requirements and ensuring our technology not only meets needs but anticipates them. Your coding prowess in Python, Java, and Go will be instrumental in delivering high-impact software that performs flawlessly, ensures durability, optimizes cost, and strengthens security. Unleash your inner API artisan! We're looking for someone with a genuine passion for crafting incredibly simple yet powerfully functional APIs that will be the backbone of our intelligent systems. Step into an agile, dynamic environment that feels like a startup but with the backing of an industry leader. You'll thrive on rapidly evolving business needs, directly impacting our trajectory and delivering quality solutions with speed and precision. Get ready to explore uncharted territories, creatively solve complex puzzles, and directly contribute to groundbreaking advancements. Who You'll Work With: Join Forces with the Best Imagine collaborating daily with some of the brightest minds in the company – individuals who champion diversity, equity, and inclusion as fundamental to our success. You'll be part of a cohesive force, laser-focused on delivering high-quality, critical, and highly visible AI/ML functionality within the Teradata Vantage platform. Your insights will directly shape the future of our intelligent data solutions. You'll report directly to the inspiring Sr. Manager, Software Engineering, who will champion your growth and empower your contributions. 
What Makes You a Qualified Candidate: Skills in Action You bring 5+ years of industry experience in the exciting world of software development and operating software systems that can handle massive scale. Your mastery of Java with the Spring Framework —amplified by expertise in AI/ML, Kubernetes, microservices architecture, and DevOps methodologies—makes you a powerhouse poised to shape the future of technology. Additional consideration will be given for proficiency in Go, Python or other object-oriented languages, as a diverse skill set enhances overall effectiveness. You possess a strong command of AI/ML algorithms, methodologies, tools, and the best practices for building robust AI/ML systems. Your foundational knowledge of data structures and algorithms is rock-solid. Proficient in end-to-end application development with a focused approach on test-first TDD methodologies and comprehensive unit testing to ensure code reliability and quality. A strong advantage: hands-on experience with AI/ML orchestration tools such as LangChain and MLflow, streamlining the training, evaluation, and deployment of AI/ML models. In-depth knowledge of observability practices, with a focus on APM tools, to proactively monitor and strengthen system resilience. Proven experience with AWS, GCP, or Azure, alongside expertise in Docker and Kubernetes, enhances our cloud-native operation. Your analytical and problem-solving skills are sharp enough to cut through any challenge. Skilled in designing complex systems that balance scalability and simplicity, ensuring both efficiency and maintainability in architecture and implementation. " You're a team player with experience in group software development and a fluent user of version control tools, especially Git. Your debugging skills are legendary – you can track down and squash bugs with finesse. You possess excellent oral and written communication skills, capable of producing clear and concise runbooks and technical documentation for both technical and non-technical audiences. Familiarity with relational database management systems (RDBMS) like PostgreSQL and MySQL is a plus. What You'll Bring: Passion and Potential A Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field – your academic foundation is key. A genuine excitement for AI and large language models (LLMs) is a significant advantage – you'll be working at the cutting edge! Experience with Analytics? That's a huge plus in our data-driven environment. Familiarity with RDBMS – PostgreSQL, MySQL etc. – understanding data is crucial. You thrive in ambiguity, tackling undefined problems with an abstract and innovative mindset. Experience driving product vision to deliver long-term value for our customers is highly valued. You're ready to own the entire development lifecycle – from initial requirements to deployment and ongoing support. You're knowledgeable about open-source tools and technologies and know how to leverage and extend them to build innovative solutions. Passion for AI/ML, especially in building smart, agent-driven interfaces that feel human. Ownership mindset — you build, deploy, iterate, and scale with a long-term view. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. 
We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
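As an illustration only of the "simple yet powerful API" emphasis in this listing (the role itself centres on Java and Spring), here is a hedged Python/FastAPI sketch; the endpoint names, request fields and scoring logic are hypothetical placeholders, not part of the role description.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-service")

class ScoreRequest(BaseModel):
    feature_a: float
    feature_b: float

class ScoreResponse(BaseModel):
    score: float

@app.get("/healthz")
def health() -> dict:
    # Liveness endpoint, e.g. for Kubernetes probes.
    return {"status": "ok"}

@app.post("/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    # Placeholder scoring logic; a real service would call a trained model here.
    return ScoreResponse(score=0.6 * req.feature_a + 0.4 * req.feature_b)
```

Such a service would typically be run with an ASGI server (for example `uvicorn module_name:app`), containerized, and deployed with orchestration probes pointed at the health endpoint.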
Posted 1 month ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise. What You'll Do: Shape the Way the World Understands Data Are you ready to be at the forefront of a revolution? At Teradata, we're not just managing data; we're unlocking its hidden potential through the power of Artificial Intelligence and Machine Learning. As a key member of our innovative AI/ML team, you'll be architecting, building, and deploying cutting-edge software solutions that will literally transform languages within the Teradata Vantage platform – a cornerstone of our strategic vision and a powerhouse in the analytics world. Dive deep into the performance DNA of AI/ML applications. You'll be the detective, identifying and crushing bottlenecks to ensure our solutions not only scale massively but also deliver lightning-fast results. Your mission? To champion quality at every stage, tackling the unique and exhilarating challenges presented by AI/ML in the cloud. Become an integral part of a brilliant, tightly-knit team where collaboration isn't just a buzzword – it's how we create world-class, enterprise-grade software that pushes boundaries. You'll be a knowledge champion, diving into the intricacies of our domain, crafting compelling documentation, and sharing your expertise to inspire other teams. Here, you'll be a technical visionary, defining the standards and architecting the solutions that will shape our future. You'll contribute to the blueprint, refining high-level requirements and ensuring our technology not only meets needs but anticipates them. Your expertise in Python, Java, Go, and MLOps will be key to delivering high-impact software solutions that perform seamlessly, ensure long-term durability, optimize costs, and uphold the highest security standards. Unleash your inner API artisan! We're looking for someone with a genuine passion for crafting incredibly simple yet powerfully functional APIs that will be the backbone of our intelligent systems. Step into an agile, dynamic environment that feels like a startup but with the backing of an industry leader. You'll thrive on rapidly evolving business needs, directly impacting our trajectory and delivering quality solutions with speed and precision. Get ready to explore uncharted territories, creatively solve complex puzzles, and directly contribute to groundbreaking advancements. Who You'll Work With: Join Forces with the Best Imagine collaborating daily with some of the brightest minds in the company – individuals who champion diversity, equity, and inclusion as fundamental to our success. You'll be part of a cohesive force, laser-focused on delivering high-quality, critical, and highly visible AI/ML functionality within the Teradata Vantage platform. Your insights will directly shape the future of our intelligent data solutions. You'll report directly to the inspiring Sr. Manager, Software Engineering, who will champion your growth and empower your contributions. 
What Makes You a Qualified Candidate: Skills in Action You bring 2+ years of industry experience in the exciting world of software development and operating software systems that can handle massive scale. Your Python skills are not just proficient; they're a must-have superpower! Bonus points for knowledge in Go, Java, or other object-oriented languages – the more tools in your arsenal, the better! You possess a strong command of AI/ML algorithms, methodologies, tools, and the best practices for building robust AI/ML systems. Your foundational knowledge of data structures and algorithms is rock-solid. Proficient in end-to-end application development with a focused approach on test-first TDD methodologies and comprehensive unit testing to ensure code reliability and quality. A strong advantage: hands-on experience with AI/ML orchestration tools such as LangChain and MLflow, streamlining the training, evaluation, and deployment of AI/ML models. You have hands-on experience with data validation, rigorous model evaluation, and performance testing that ensures our machine learning models are top-tier. Knowledge of containerization and orchestration tools like Docker and Kubernetes? That's a significant plus in our cloud-native world. Your analytical and problem-solving skills are sharp enough to cut through any challenge. You're a team player with experience in group software development and a fluent user of version control tools, especially Git. Your debugging skills are legendary – you can track down and squash bugs with finesse. You possess excellent oral and written communication skills, capable of producing clear and concise runbooks and technical documentation for both technical and non-technical audiences. Familiarity with relational database management systems (RDBMS) like PostgreSQL and MySQL is a plus. What You Bring: Passion and Potential A Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field – your academic foundation is key. A genuine excitement for AI and large language models (LLMs) is a significant advantage – you'll be working at the cutting edge! Experience with Analytics? That's a huge plus in our data-driven environment. Familiarity with RDBMS – PostgreSQL, MySQL etc. – understanding data is crucial. You thrive in ambiguity, tackling undefined problems with an abstract and innovative mindset. Experience driving product vision to deliver long-term value for our customers is highly valued. You're ready to own the entire development lifecycle – from initial requirements to deployment and ongoing support. You're knowledgeable about open-source tools and technologies and know how to leverage and extend them to build innovative solutions. Passion for AI/ML, especially in building smart, agent-driven interfaces that feel human. Ownership mindset — you build, deploy, iterate, and scale with a long-term view. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. 
Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
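To give a concrete, hedged example of the model evaluation and validation gates this role mentions, the sketch below uses scikit-learn on synthetic data; the model choice and the promotion threshold are hypothetical.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; a real pipeline would pull features from the platform.
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"hold-out accuracy: {accuracy:.3f}")

# A simple evaluation gate of the kind a test suite might enforce before promoting a model.
assert accuracy >= 0.7, "model below the hypothetical promotion threshold"
```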
Posted 1 month ago
0.0 - 8.0 years
0 Lacs
Panchkula, Haryana
On-site
Job Description
We are looking for a skilled and experienced Lead/Senior ETL Engineer with 4–8 years of experience to join our data engineering team. In this role, you will be responsible for designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. You’ll also contribute to architectural decisions, lead delivery planning, and provide mentorship to team members. Your hands-on expertise in ETL tools, cloud platforms, and scripting will be key to building efficient, scalable, and reliable data solutions for enterprise-level implementations.
Key Skills
Strong hands-on experience with ETL tools like SSIS, DataStage, Informatica, or Talend. Deep understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, Fact & Dimension tables. Proficient in working with relational databases: SQL Server, Oracle, Teradata, DB2, or MySQL. Solid scripting/programming skills in Python. Hands-on experience with cloud platforms such as AWS or Azure. Knowledge of middleware architecture and enterprise data integration strategies. Familiarity with reporting/BI tools like Tableau and Power BI. Strong grasp of data modeling principles and performance optimization. Ability to write and review high and low-level design documents. Strong communication skills and experience working with cross-cultural, distributed teams.
Roles and Responsibilities
Design and develop ETL workflows and data integration strategies. Create and review high and low-level designs adhering to best practices. Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions. Coach and mentor junior engineers to support skill development and performance. Ensure timely delivery, escalate issues proactively, and manage QA and validation processes. Participate in planning, estimations, and recruitment activities. Work on multiple projects simultaneously, ensuring quality and consistency in delivery. Experience in Sales and Marketing data domains. Exposure to reporting and analytics projects. Strong problem-solving abilities with a data-driven mindset. Ability to work independently and collaboratively in a fast-paced environment. Prior experience in global implementations and managing multi-location teams is a plus.
Contacts
Email: careers@grazitti.com
Address: HSIIDC Technology Park, Plot No – 19, Sector 22, 134104, Panchkula, Haryana, India
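As a hedged illustration of the fact and dimension table concepts listed under Key Skills, the short Python/pandas sketch below resolves natural keys to surrogate keys when building a fact table; all table and column names are hypothetical.

```python
import pandas as pd

# Hypothetical staged source rows.
sales = pd.DataFrame({
    "order_date": ["2024-05-01", "2024-05-01", "2024-05-02"],
    "product": ["Widget", "Gadget", "Widget"],
    "amount": [250.0, 125.0, 300.0],
})

# Dimension tables with surrogate keys.
dim_date = pd.DataFrame({"date_key": [20240501, 20240502], "order_date": ["2024-05-01", "2024-05-02"]})
dim_product = pd.DataFrame({"product_key": [1, 2], "product": ["Widget", "Gadget"]})

# Fact table: resolve natural keys to surrogate keys, keep only foreign keys plus the measure.
fact_sales = (
    sales
    .merge(dim_date, on="order_date")
    .merge(dim_product, on="product")
    [["date_key", "product_key", "amount"]]
)
print(fact_sales)
```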
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role : Sr. Data Engineer
Exp : 5 to 8 Years
Location :
The Business Problem and the Relevant Data :
Maintain an intimate understanding of company and department strategy. Translate analysis requirements into data requirements. Identify and understand the data sources that are relevant to the business problem. Develop conceptual models that capture the relationships within the data. Define the data-quality objectives for the solution. Be a subject matter expert in data sources and reporting.
Data Management Systems :
Leverage understanding of the business problem and the nature of the data to select the appropriate data management system (Big Data, OLTP, OLAP, etc.). Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements. Plan methods for archiving/deletion of data.
Experience :
5+ years developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions. Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.). Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake. Experienced in cloud-based solutions using AWS/Azure/GCP. Significant experience with big data processing and/or developing applications and data sources via Spark, etc. Understanding of how distributed systems work. Familiarity with software architecture (data structures, data schemas, etc.). Strong working knowledge of databases (Oracle, MSSQL, etc.) including SQL and NoSQL. Strong mathematics background, analytical, problem solving, and organizational skills. Strong communication skills (written, verbal and presentation). Experience working in a global, cross-functional environment. Minimum of 2 years of experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); at least one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development. Ability to travel as needed. (ref:hirist.tech)
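For illustration of the ELT pattern into Snowflake referenced above, here is a hedged sketch using the Snowflake Python connector; the account, credentials, stage and table names are placeholders and would come from configuration or a secrets manager in practice.

```python
import snowflake.connector

# Placeholder credentials; replace with real account details (ideally from a secrets manager).
ctx = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = ctx.cursor()
    # ELT pattern: bulk-load files already landed in an external stage, then transform in-database.
    cur.execute("COPY INTO STG_ORDERS FROM @RAW_STAGE/orders/ FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)")
    cur.execute("""
        INSERT INTO ANALYTICS.CORE.ORDERS
        SELECT order_id, customer_id, TO_DATE(order_date), amount
        FROM STG_ORDERS
    """)
finally:
    ctx.close()
```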
Posted 1 month ago
5.0 years
0 Lacs
Telangana, India
On-site
Our Company At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise. The Security Operations Analyst is responsible for monitoring, analyzing, and responding to cybersecurity incidents and threats promptly. This role is crucial in protecting the organization’s digital infrastructure, data, and assets by supporting daily security operations, investigating alerts, and enhancing the security posture through continuous improvement of detection and response capabilities. Work You’ll Do Monitor SIEM and security tools for suspicious activity and potential threats. Triage and analyze security alerts to determine impact and urgency. Investigate and respond to cybersecurity incidents, including malware infections, phishing, unauthorized access, and data exfiltration. Escalate significant incidents to senior analysts or incident response teams as needed. Maintain and tune security tools such as SIEM, EDR, IDS/IPS, and firewalls. Assist in rule creation and fine-tuning to reduce false positives and improve detection. Consume and correlate threat intelligence feeds with internal data. Identify indicators of compromise (IOCs) and proactively hunt for threats. Analyze logs from various sources (network, system, application) for anomalies. Correlate events across multiple data sets to uncover patterns and threats. Document incidents, response actions, and findings in incident management systems. Prepare regular reports on security posture, incident metrics, and threat trends. Assist in educating users on secure practices and common threats. What Makes You a Qualified Candidate Bachelor’s degree in Cybersecurity, Computer Science, Information Technology, or a related field. 2–5 years of experience in a security operations or SOC role. Experience with SIEM platforms (e.g., Splunk, Microsoft Sentinel, QRadar, etc.). Hands-on knowledge of security tools (e.g., EDR, IDS, firewalls, threat intelligence platforms). Familiarity with common threat vectors, attack techniques (MITRE ATT&CK), and incident response processes. Working knowledge of TCP/IP, networking concepts, Windows/Linux logs, and cloud security. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. 
We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
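As a simplified, hedged illustration of the log-analysis and detection-tuning work described in this listing, the Python sketch below counts failed logins per source IP against a hypothetical alerting threshold; real detections would run in the SIEM itself.

```python
from collections import Counter

# Hypothetical auth log lines; in practice these would come from a SIEM export or syslog feed.
log_lines = [
    "2024-06-01T10:00:01 FAILED_LOGIN user=alice src=203.0.113.7",
    "2024-06-01T10:00:03 FAILED_LOGIN user=alice src=203.0.113.7",
    "2024-06-01T10:00:05 FAILED_LOGIN user=bob   src=203.0.113.7",
    "2024-06-01T10:00:09 FAILED_LOGIN user=carol src=198.51.100.2",
    "2024-06-01T10:01:00 LOGIN_OK     user=alice src=192.0.2.10",
]

THRESHOLD = 3  # hypothetical alerting threshold for failed logins per source IP

failures_by_ip = Counter(
    line.split("src=")[1].strip()
    for line in log_lines
    if "FAILED_LOGIN" in line
)

for ip, count in failures_by_ip.items():
    if count >= THRESHOLD:
        print(f"ALERT: {count} failed logins from {ip} - possible brute force, review in SIEM")
```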
Posted 1 month ago
7.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title : Senior Data Engineer Azure, ETL, Snowflake.
Experience : 7+ yrs.
Location : Remote.
Job Summary
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, Cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools like ADF, Databricks, Fivetran, and DBT.
Key Responsibilities
Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake. Write optimized SQL queries, stored procedures, and views to transform and retrieve data. Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc. Work on data mapping, modelling, and transformation tasks across multiple data sources. Build and deploy custom connectors using Python, PySpark, or ADF. Implement and manage Snowflake as a data storage and processing solution. Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub. Ensure smooth cloud migration and data pipeline deployment using Azure services. Work with Fivetran and DBT for ingestion and transformation as required. Participate in Agile/Scrum ceremonies and follow DevSecOps practices.
Mandatory Skills & Qualifications
7+ years of experience in Data Engineering, ETL development, or similar roles. Proficient in SQL with a strong understanding of joins, filters, and aggregations. Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.). Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage. Experience with Azure Cloud Services, specifically: Azure Data Factory (ADF), Databricks, Azure Data Lake. Hands-on experience in Snowflake implementation (ETL or Storage Layer). Familiarity with data modelling, data mapping, and pipeline creation. Experience working with semi-structured/unstructured data formats. Working knowledge of GitHub for version control and code management.
Good To Have / Preferred Skills
Experience using Fivetran and DBT for ingestion and transformation. Knowledge of AWS or GCP cloud environments. Familiarity with DevSecOps processes and CI/CD pipelines within Azure. Proficiency in Excel and Macros. Exposure to Agile methodologies (Scrum/Kanban). Understanding of custom connector creation using PySpark or ADF.
Soft Skills
Strong analytical and problem-solving skills. Effective communication and teamwork abilities. Ability to work independently and take ownership of deliverables. Detail-oriented with a commitment to quality.
Why Join Us?
Work on modern, cloud-based data platforms. Exposure to a diverse tech stack and new-age data tools. Flexible remote working opportunity aligned with a global team. Opportunity to work on critical enterprise-level data solutions. (ref:hirist.tech)
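To make the JSON-parsing and ingestion skills above concrete, here is a minimal, hedged Python sketch that flattens a nested API-style payload with pandas; the payload shape and field names are hypothetical, and in practice the data would come from an API call or a landed file.

```python
import pandas as pd

# Sample payload of the shape an upstream API might return; in practice this
# would come from requests.get(...).json() against the real endpoint.
payload = {
    "orders": [
        {"id": 1, "customer": {"name": "Asha", "city": "Pune"},
         "items": [{"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 1}]},
        {"id": 2, "customer": {"name": "Ravi", "city": "Chennai"},
         "items": [{"sku": "A1", "qty": 5}]},
    ]
}

# Flatten the nested JSON: one row per line item, with order and customer fields repeated.
flat = pd.json_normalize(
    payload["orders"],
    record_path="items",
    meta=["id", ["customer", "name"], ["customer", "city"]],
)
print(flat)
```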
Posted 1 month ago
7.0 - 14.0 years
0 Lacs
Greater Kolkata Area
On-site
Key Responsibilities
Develop and optimize complex SQL queries, including joins (inner/outer), filters, and aggregations. Work with diverse datasets from multiple database sources, ensuring data quality and integrity. Leverage Python for data manipulation, including functions, iterations, API requests, and JSON flattening. Use Python to interpret, manipulate, and process data to facilitate downstream analysis. Design, implement, and optimize ETL processes and workflows. Manage data ingestion from various formats (e.g., JSON, Parquet, TXT, XLSX) using tools like Informatica, Teradata, DataStage, Talend, and Snowflake. Demonstrate expertise in Azure services, specifically ADF, Databricks, and Azure Data Lake. Create, manage, and optimize cloud-based data pipelines. Integrate data sources via Fivetran or custom connectors (e.g., PySpark, ADF). Lead the implementation of Snowflake as an ETL and storage layer. Ensure seamless data connectivity, including handling semi-structured/unstructured data. Promote code and manage changes across various environments. Proficient in writing complex SQL scripts, including stored procedures, views, and functions. Hands-on experience with Snowflake in multiple projects. Familiarity with DBT for transformation logic and Fivetran for data ingestion. Strong understanding of data modeling and data warehousing fundamentals. Experience with GitHub for version control and code management.
Skills & Experience :
7 to 14 years of experience in Data Engineering, with a focus on SQL, Python, ETL, and cloud technologies. Hands-on experience with Snowflake implementation and data pipeline management. In-depth understanding of Azure cloud tools and services, such as ADF, Databricks, and Azure Data Lake. Expertise in designing and managing ETL workflows, data mapping, and ingestion from multiple data sources/formats. Proficient in Python for data interpretation, manipulation, and automation tasks. Strong knowledge of SQL, including advanced techniques such as stored procedures and functions. Experience with GitHub for version control and collaborative development.
Good To Have :
Experience with other cloud platforms (e.g., AWS, GCP). Familiarity with DataOps and continuous integration/continuous delivery (CI/CD) practices. Prior experience leading or mentoring teams of data engineers. (ref:hirist.tech)
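As a small, hedged illustration of the joins, filters and aggregations called out above, the sketch below runs an outer join, a filter and a grouped aggregation against an in-memory SQLite database; the schema and values are hypothetical stand-ins for a warehouse table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders    (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'South'), (2, 'West'), (3, 'South');
    INSERT INTO orders    VALUES (10, 1, 500.0), (11, 1, 150.0), (12, 2, 90.0);
""")

# Left outer join so customers with no orders still appear, then filter and aggregate by region.
query = """
    SELECT c.region,
           COUNT(o.order_id)            AS order_count,
           COALESCE(SUM(o.amount), 0.0) AS total_amount
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    WHERE c.region IN ('South', 'West')
    GROUP BY c.region
    ORDER BY total_amount DESC
"""
for row in conn.execute(query):
    print(row)
```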
Posted 1 month ago