Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2 - 5 years
5 - 15 Lacs
Gurgaon, Noida
Work from Office
Strong understanding of data governance, data quality, and data management principles. Experience with data profiling, data quality metrics, and data validation techniques, and with data governance tools such as data catalogs and metadata management systems. Required Candidate profile: Minimum 2 years of experience in data stewardship in any domain, with a strong understanding of data governance, data quality, and data management principles and practices.
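The data profiling and data quality metrics mentioned above can be illustrated with a minimal sketch in plain Python (a generic illustration; no specific governance tool or its API is assumed): compute a per-column null rate and uniqueness ratio over a list of records.

```python
def profile_column(rows, column):
    """Compute two simple data-quality metrics for one column:
    null rate (share of missing/empty values) and uniqueness
    (distinct non-null values / total non-null values)."""
    values = [row.get(column) for row in rows]
    total = len(values)
    non_null = [v for v in values if v not in (None, "")]
    nulls = total - len(non_null)
    return {
        "null_rate": nulls / total if total else 0.0,
        "uniqueness": len(set(non_null)) / len(non_null) if non_null else 0.0,
    }

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "a@x.com"},
    {"id": 4, "email": "b@x.com"},
]
print(profile_column(rows, "email"))  # null_rate 0.25, uniqueness 2/3
```

A real stewardship workflow would run such checks per column across whole datasets and compare the results against agreed data-quality thresholds.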
Posted 3 months ago
3 - 6 years
8 - 12 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Hi! We are hiring for a leading ITES company for a Clinical Data Manager profile.
Role & responsibilities: The candidate should have 2-5 years of CDM experience, with experience in Conduct.
Scope of work:
• Perform day-to-day Clinical Data Management activities.
• Work and coordinate with the team to perform data management activities and deliver an error-free, quality database in accordance with the data management plan and regulatory standards.
• Read and understand the study protocol and the timelines.
• Perform test data entry in the TEST environment, data listing review, data reconciliation, and query management tasks; escalate/action discrepancies in the clinical data as appropriate.
• Perform external checks to handle manual discrepancies and action the same.
• Ensure error-free, quality data with no open queries.
• Escalate any discrepancy in the clinical data to the study lead as appropriate.
• Complete training on time.
• Any other tasks deemed appropriate.
• Perform medical data collection and analysis of prostate cancer data using databases such as HIS/EMR (Electronic Medical Record), Caisis, and Rave, and CDM (startup, closeout, conduct).
• Client interaction and meetings.
• Bring up new ideas and execute new plans to cope with the backlog.
• Train new team members as and when required.
To apply, WhatsApp 'Hi' @ 9151555419 and follow the steps below:
a) For the position in Mumbai, search: Clinical Data Manager Mumbai (Job Code # 205)
b) For the position in Pune, search: Clinical Data Manager Pune (Job Code # 206)
c) For the position in Bangalore, search: Clinical Data Manager Bangalore (Job Code # 207)
Posted 3 months ago
8 - 13 years
40 - 80 Lacs
Pune
Work from Office
What you'll do:
There is no better time to join Eaton than in this exciting era of power management. We're reimagining innovation by adapting digital technologies (connected devices, data models, and insights) to transform power management for safer, more sustainable, and more efficient power use. Our teams are collaborating to build the best digital solutions for our customers. We are looking for a Data Engineer based in Pune, India. At Eaton, making our work exciting, engaging, and meaningful; ensuring safety, health, and wellness; and being a model of inclusion & diversity are already embedded in who we are: it's in our values, part of our vision, and our clearly defined aspirational goals.
This exciting role offers an opportunity to join Eaton Corporation's Center for Intelligent Power, which has an opening for a Data Engineer. As a Data Engineer, you will be responsible for designing, developing, and maintaining our data infrastructure and systems. You will collaborate with cross-functional teams to understand data requirements, implement data pipelines, and ensure the availability, reliability, and scalability of our data solutions. You can program in several languages and understand the end-to-end software development cycle, including CI/CD and software release. You will also be responsible for developing and maintaining Power BI reports and dashboards, ensuring they meet business requirements and provide actionable insights. You will work closely with stakeholders to gather requirements and deliver high-quality data visualizations that drive decision-making.
* Design, develop, and maintain scalable data pipelines and data integration processes to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake.
* Collaborate with stakeholders to understand data requirements and translate them into efficient and scalable data engineering solutions.
* Optimize data models, database schemas, and data processing algorithms to ensure efficient and high-performance data storage and retrieval.
* Implement and maintain data quality and data governance processes, including data cleansing, validation, and metadata management.
* Work closely with data scientists, analysts, and business intelligence teams to support their data needs and enable data-driven decision-making.
* Develop and implement data security and privacy measures to ensure compliance with regulations and industry best practices.
* Monitor and troubleshoot data pipelines, identifying and resolving performance or data quality issues in a timely manner.
* Stay up to date with emerging technologies and trends in the data engineering field, evaluating and recommending new tools and frameworks to enhance data processing and analytics capabilities.
* Build insights using BI tools (Power BI).
* Collaborate with infrastructure and operations teams to ensure the availability, reliability, and scalability of data systems and infrastructure.
* Mentor and provide technical guidance to junior data engineers, promoting best practices and knowledge sharing.
Qualifications:
Required: Bachelor's degree from an accredited institution
3+ years of experience in Data Engineering and Power BI
3+ years of experience in data analytics
Skills:
Apache Spark, Python
Azure experience (Databricks, Docker, Function App)
Git
Working knowledge of Airflow
Knowledge of Kubernetes and Docker
Power BI:
o Data Visualization: Proficient in creating interactive and visually appealing dashboards and reports using Power BI.
o Data Modeling: Experience in designing and implementing data models, including relationships, hierarchies, and calculated columns/measures.
o DAX (Data Analysis Expressions): Strong knowledge of DAX for creating complex calculations and aggregations.
o Power Query: Skilled in using Power Query for data transformation and preparation.
o Integration: Ability to integrate Power BI with various data sources such as SQL databases, Excel, and cloud services.
o Performance Optimization: Experience in optimizing Power BI reports for performance and scalability.
o Security: Knowledge of implementing row-level security and managing user access within Power BI.
o Collaboration: Experience in sharing and collaborating on Power BI reports and dashboards within an organization.
o Best Practices: Familiarity with Power BI best practices and staying updated with the latest features and updates.
Background in SQL and experience working with relational databases.
Bachelor's degree in Computer Science, Software Engineering, or Information Technology
Experience on cloud development platforms (Azure & AWS) and their associated data storage options
Experience with CI/CD (Continuous Integration/Delivery), e.g. Jenkins, Git, Travis CI
Virtual build environments (containers, VMs, and microservices) and container orchestration: Docker Swarm, Kubernetes/Red Hat OpenShift
Relational & non-relational database systems: SQL, PostgreSQL, NoSQL, MongoDB, CosmosDB
Posted 3 months ago
3 - 5 years
5 - 7 Lacs
Mumbai
Work from Office
The Strategy & Consulting Global Network Song Practice
Job Title: Content Writer + Analyst, The Strategy & Consulting Global Network Song Practice
Management Level: 11 - Analyst
Location: Mumbai
Must have skills: Content creation (including static & video assets)
Good to have skills: Basic knowledge of SEO principles, including keyword research and optimization; knowledge of digital ad platforms like Google Ads, Meta Ads, or LinkedIn Ads; familiarity with social media analytics tools to gauge campaign performance
Job Summary: We are seeking a Content Specialist to create high-quality, engaging, and SEO-optimized content across various channels, including blogs, social media, and marketing materials. This role involves editing, research, and content optimization to ensure clarity, accuracy, and brand consistency. You will collaborate with cross-functional teams, analyze content performance, and manage multiple projects while adhering to brand and compliance standards. Join our Song team, which solves customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.
Explore an Exciting Career at Accenture: Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.
The Practice, in Brief: The practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales, and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
Content Creation: Develop high-quality, engaging, and SEO-optimized content for various channels, including websites, blogs, whitepapers, social media, email campaigns, and more. Create thought leadership articles, case studies, and product-centric content that aligns with brand goals. Create brochures and point-of-sale materials.
Editing and Proofreading: Edit and proofread content to ensure grammatical accuracy, clarity, and consistency with brand tone and style. Review and refine user-facing content to meet high-quality standards and eliminate errors.
Research and Analysis: Perform thorough research on industry trends, topics, and competitors to deliver credible and relevant content. Analyze content performance metrics to iterate and improve strategies for better engagement.
Collaboration: Work closely with marketing, design, and product teams to ensure alignment between content and overall campaign goals. Partner with designers to create visually appealing and content-rich assets, such as infographics and presentations.
Content Optimization: Optimize web content for search engines (SEO) using targeted keywords, metadata, and link-building strategies. Adapt and repurpose content for different formats and platforms to maximize reach and impact.
Project Management: Manage multiple content projects simultaneously, ensuring timely delivery while maintaining quality. Stay flexible and efficient in a fast-paced, deadline-driven environment.
Compliance and Documentation: Ensure all content complies with legal and brand standards, including accessibility and regulatory requirements. Maintain organized project files, style guides, and documentation for easy handoff and collaboration.
Bring your best skills forward to excel at the role: Content Writing & Editing; SEO & Digital Marketing; Research & Analytics; Project Management & Collaboration; Technical Proficiency (CMS, Analytics, Design Tools)
Your experience counts! Postgraduate or graduate in Marketing/Advertising, BMM, or another mass media degree. Candidate with at least 3+ years of hands-on experience in content creation.
What's in it for you? An opportunity to work with key G2000 clients. Potential to work with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible practice into everything, from how you service your clients to how you operate as a professional. Personalized training modules to develop and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.
Posted 3 months ago
3 - 7 years
6 - 11 Lacs
Hyderabad
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility
Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively
Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities
Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns
Preferred technical and professional experience:
Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives
Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation
Posted 3 months ago
3 - 8 years
8 - 10 Lacs
Hyderabad
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Experience in the optimization of Ataccama data management solutions
Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama
Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center
Preferred technical and professional experience:
Design and implement data matching and deduplication strategies using Ataccama Data Matching
Utilize Ataccama Data Catalog for metadata management, data lineage tracking, and data discovery
Provide expertise in integrating Ataccama with other data management tools and platforms within the organization's ecosystem
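The matching and deduplication strategy named above can be sketched in plain Python (a generic illustration of the technique only; Ataccama's own rule engine and APIs are not assumed): normalize candidate fields into a composite key, then keep one survivor per key.

```python
def normalize(value):
    """Crude normalization for matching: lowercase and keep only
    alphanumeric characters, so 'Ana Gomez ' and 'ANA GOMEZ' match."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

def deduplicate(records, key_fields):
    """Group records by a normalized composite key and keep the first
    record of each group as the survivor (a simple survivorship rule)."""
    survivors = {}
    for rec in records:
        key = tuple(normalize(rec[f]) for f in key_fields)
        survivors.setdefault(key, rec)  # first record seen wins
    return list(survivors.values())

people = [
    {"name": "Ana Gomez", "city": "Pune"},
    {"name": "ana gomez ", "city": "PUNE"},
    {"name": "Ravi Iyer", "city": "Pune"},
]
print(len(deduplicate(people, ["name", "city"])))  # 2: the Ana Gomez variants merge
```

Production matching tools add fuzzy comparison (edit distance, phonetic keys) and configurable survivorship rules on top of this basic blocking-and-merging idea.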
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Kolkata
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Informatica Product 360 (PIM)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: As per Accenture standards
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Informatica Product 360 (PIM). Your typical day will involve working with the PIM tool, collaborating with cross-functional teams, and delivering impactful data-driven solutions.
Roles & Responsibilities:
Design, build, and configure applications using Informatica Product 360 (PIM) to meet business process and application requirements.
Collaborate with cross-functional teams to identify and prioritize requirements, ensuring that solutions are aligned with business needs.
Develop and maintain technical documentation, including design documents, test plans, and user manuals.
Provide technical guidance and support to junior team members, ensuring that best practices are followed and that solutions are delivered on time and within budget.
Stay updated with the latest advancements in Informatica Product 360 (PIM) and related technologies, integrating innovative approaches for sustained competitive advantage.
Professional & Technical Skills:
Must have skills: Experience with Informatica Product 360 (PIM). Strong understanding of data modeling, data integration, and data quality concepts. Experience with ETL tools such as Informatica PowerCenter. Experience with SQL and relational databases such as Oracle, SQL Server, and MySQL. Experience with web services and APIs, including SOAP and REST. Experience with Agile development methodologies.
Good to have skills: Experience with cloud-based data integration platforms such as Informatica Cloud. Experience with master data management (MDM) solutions. Experience with data governance and metadata management tools. Experience with data profiling and data quality tools. Experience with data visualization tools such as Tableau or Power BI.
Additional Information: The candidate should have a minimum of 7.5 years of experience in Informatica Product 360 (PIM). The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Noida office.
Qualifications: As per Accenture standards
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities:
Design, develop, and maintain conceptual, logical, and physical data models to support business needs and objectives across various domains.
Collaborate with stakeholders, including business analysts, data architects, and application developers, to gather and define data requirements.
Implement best practices for data modeling, including normalization, de-normalization, and dimensional modeling, to support efficient data storage and retrieval.
Develop and maintain data dictionaries, data lineage documentation, and metadata repositories to support data governance and standardization efforts.
Perform data mapping and data profiling to ensure data quality and consistency across systems and environments.
Work closely with ETL developers to design data integration strategies, ensuring seamless data flow between source and target systems.
Knowledge of star/snowflake schema.
Knowledge of cloud technologies: Azure and/or GCP and/or AWS.
Experience with the Erwin Data Modeler tool is a must.
Professional & Technical Skills:
Must have skills: Proficiency in Data Modeling Techniques and Methodologies.
Good to have skills: Experience with Oracle Procedural Language Extensions to SQL (PL/SQL). Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
Additional Information: The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies. This position is based at our Pune office. A 15 years full-time education is required.
Qualifications: 15 years full time education
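The star schema mentioned in the responsibilities can be shown with a minimal sketch using Python's built-in sqlite3 module (table and column names here are illustrative, not taken from the posting): one fact table whose foreign keys point at two dimension tables, queried with a typical join-and-aggregate.

```python
import sqlite3

# In-memory database holding a minimal star schema: one fact, two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (10, '2024-01-01'), (11, '2024-01-02');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical dimensional query: total sales per product name.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```

A snowflake schema would differ only in that the dimensions themselves are further normalized into sub-tables (e.g. product referencing a separate category table).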
Posted 3 months ago
4 - 6 years
6 - 10 Lacs
Bengaluru
Work from Office
Responsibilities: As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.
In your role, you may be responsible for:
Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
SQL authoring, query, and cost optimisation, primarily on BigQuery
Python as an object-oriented scripting language
Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
Version control system: Git; preferably, knowledge of Infrastructure as Code: Terraform
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases
Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience:
Experience building and optimising data pipelines, architectures, and data sets
Building processes supporting data transformation, data structures, metadata, dependency, and workload management
Working knowledge of message queuing, stream processing, and highly scalable data stores
Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
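The pipeline skills listed above (SQL plus Python as a scripting language) can be illustrated with a minimal, tool-agnostic sketch (no BigQuery or Dataflow client is assumed; function and field names are invented for illustration): a transform step that cleanses raw records before a stand-in load step aggregates them.

```python
def cleanse(records):
    """Transform step: skip rows with missing amounts and normalize
    the user field, yielding clean records one at a time."""
    for rec in records:
        raw = rec.get("amount")
        if raw in (None, ""):
            continue  # drop incomplete rows rather than loading bad data
        yield {"user": rec["user"].strip().lower(), "amount": float(raw)}

def load(records):
    """Stand-in for a warehouse load step: aggregate amounts per user."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

raw_rows = [
    {"user": " Alice ", "amount": "10.5"},
    {"user": "bob", "amount": ""},        # incomplete: dropped by cleanse()
    {"user": "ALICE", "amount": "4.5"},
]
print(load(cleanse(raw_rows)))  # {'alice': 15.0}
```

In a managed pipeline the same cleanse/load split maps onto separate stages (e.g. a transform and a sink), with the orchestration tool handling scheduling and retries.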
Posted 3 months ago
3 - 6 years
4 - 8 Lacs
Hyderabad
Work from Office
Responsibilities: As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Experience in the optimization of Ataccama data management solutions
Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama
Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center
Design and implement data matching and deduplication strategies using Ataccama Data Matching
Utilize Ataccama Data Catalog for metadata management, data lineage tracking, and data discovery
Preferred technical and professional experience:
You thrive on teamwork and have excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
3 - 6 years
5 - 8 Lacs
Bengaluru
Work from Office
Responsibilities: As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.
In your role, you may be responsible for:
Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
SQL authoring, query, and cost optimisation, primarily on BigQuery
Python as an object-oriented scripting language
Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
Version control system: Git; preferably, knowledge of Infrastructure as Code: Terraform
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases
Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience:
Experience building and optimising data pipelines, architectures, and data sets
Building processes supporting data transformation, data structures, metadata, dependency, and workload management
Working knowledge of message queuing, stream processing, and highly scalable data stores
Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
3 - 6 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
• Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
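The listing's "SQL cost optimisation primarily on BigQuery" largely comes down to reducing bytes scanned, since BigQuery's on-demand model bills per data scanned. A minimal back-of-the-envelope sketch in Python; the per-TiB rate is illustrative only and should be checked against current pricing:

```python
def query_cost_usd(bytes_scanned, usd_per_tib=6.25):
    """Estimate on-demand query cost from bytes scanned (rate is illustrative)."""
    return bytes_scanned / 2**40 * usd_per_tib

# Scanning a 500 GiB table costs about $3.05 at this rate; selecting only the
# needed columns and pruning partitions shrinks the scan, and hence the bill.
print(round(query_cost_usd(500 * 2**30), 2))
```

This is why `SELECT *` over an unpartitioned table is the classic BigQuery anti-pattern: cost scales with what is scanned, not with what is returned.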
Posted 3 months ago
3 - 6 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
• Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
• Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving those issues per the defined SLAs.
• Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
3 - 6 years
5 - 8 Lacs
Bengaluru
Work from Office
ETL testing skills along with MongoDB, Tableau, and SQL.
Familiarity with API lifecycle management tools and platforms (e.g., Apigee, Postman).
Understanding of API governance, OpenAPI specifications, and metadata management.
Ability to interpret API usage data and present actionable insights.
Analyse API usage data to understand adoption trends and highlight areas for improvement.
Develop insights from catalogue usage patterns to showcase value and drive strategic decisions.
Good communication skills.
Experience in stakeholder and risk management.
Take ownership of their work to deliver with quality and within committed timelines.
Mandatory skills: MongoDB, SQL, Tableau, Excel
Desired/secondary skills: AWS
Max vendor rate per day (currency in relevance to work location): INR 6500/day
Work location given in ECMS ID: Bangalore, Chennai, Gurgaon
BG check (before or after onboarding): Before onboarding
Working in shifts outside standard daytime hours: No
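The "analyse API usage data to understand adoption trends" duty above can be made concrete with a minimal sketch in plain Python; the record layout and names here are hypothetical, not from any specific platform:

```python
from collections import defaultdict

def adoption_by_month(usage_records):
    """Count distinct consumers per (api, month) from (api, consumer, month) tuples."""
    seen = defaultdict(set)
    for api, consumer, month in usage_records:
        seen[(api, month)].add(consumer)  # a set de-duplicates repeat calls
    return {key: len(consumers) for key, consumers in seen.items()}

records = [
    ("orders-api", "team-a", "2024-01"),
    ("orders-api", "team-b", "2024-01"),
    ("orders-api", "team-b", "2024-02"),
]
print(adoption_by_month(records))
```

Comparing distinct-consumer counts month over month is one simple way to show whether an API in the catalogue is gaining or losing adopters.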
Posted 3 months ago
8 - 13 years
45 - 50 Lacs
Bengaluru
Work from Office
The Senior Data Engineer role involves leading and mentoring a team of data engineers, coordinating with other teams to manage and prioritize projects, and driving the strategy, roadmap, and execution of key data engineering initiatives. The role requires understanding and translating business needs into data models, creating robust data pipelines, and developing and maintaining data lakes. It also involves implementing and maintaining CI/CD pipelines for data solutions. The candidate should be able to define and manage data load procedures, implement data strategies, and ensure robust operational data management systems. Collaborating with stakeholders across the organization to understand their data needs and deliver solutions is also a key part of this role. The ideal candidate will be proficient in big data tools like Hadoop, Hive, and Spark and in programming languages such as Scala, Python, and SQL, and will have strong analytical skills related to working with structured and unstructured datasets. This position is based in Visa's offices in Bangalore, India, and presents an excellent opportunity for those looking to make a significant impact in the field of Data Engineering.
Essential Functions:
• Experience in requirement gathering, estimating, and managing large-scale Data Engineering projects.
• Requirement Analysis: Understand and translate business needs into data models supporting long-term solutions.
• Data Modeling: Work with the business team to implement data strategies, build data flows, and develop conceptual data models.
• Data Pipeline Design: Create robust and scalable data pipelines and data products in a variety of domains.
• Data Integration: Develop and maintain data lakes by acquiring data from primary and secondary sources, and build scripts that make our data evaluation process more flexible and scalable across data sets.
• Testing: Define and manage the data load procedures to reject or discard datasets or data points that do not meet the defined business rules and quality thresholds.
• Deployment: Implement data strategies and develop physical data models, along with the development teams, data analyst teams, and information systems team, to ensure robust operational data management systems.
• Understanding of, and ability to implement, Data Engineering principles and best practices.
• Team Leadership: Lead and mentor a team of data engineers. Coordinate with other teams to manage and prioritize projects. Drive strategy, roadmap, and execution of key data engineering initiatives.
• Stakeholder Management: Collaborate with stakeholders across the organization to understand their data needs and deliver solutions.
• Continuous Integration and Continuous Deployment (CI/CD): Implement and maintain CI/CD pipelines for data solutions, ensuring rapid, reliable, and streamlined updates to the data environment.
This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager.
Technical Skills:
• Extensive experience in big data tools: Hadoop, Hive, and Spark.
• Proficiency in Scala, Python, SQL, and PySpark.
• Experience with Unix/Linux systems, with scripting experience in Bash.
• Experience with data pipeline and workflow management tools like Airflow.
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
• Experience with cloud services such as AWS or Azure.
• Experience with stream-processing systems: Kafka, Spark Streaming, etc.
• Strong analytic skills related to working with structured and unstructured datasets.
• Proficiency in managing and communicating data warehouse plans to internal clients.
• Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
Other Skills: Strong problem-solving skills. Excellent communication skills. Ability to work in a team. Detail-oriented with excellent organizational skills.
Qualifications:
• 8+ years of work experience with a bachelor's degree, or 5+ years of work experience with an advanced degree (e.g., Master's, MBA, JD, MD), or 3+ years of work experience with a PhD.
• Exposure to the Financial Services/Payments industry.
• Proven leadership skills with experience leading a team of data engineers.
Leadership Competencies:
• Exhibits intellectual curiosity and a desire for continuous learning.
• Demonstrates integrity, maturity, and a constructive approach to business challenges.
• Acts as a role model for the organization, implementing core Visa values: respect for individuals at all levels in the workplace, and striving for excellence and extraordinary results.
• Uses sound insights and judgment to make informed decisions in line with business strategy and needs.
• Able to allocate tasks and resources across multiple lines of business and geographies.
• Able to influence senior management within and outside Analytics groups.
• Able to successfully persuade/influence internal stakeholders to build best-in-class solutions.
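The "Testing" function above (rejecting datasets or data points that fail business rules and quality thresholds) can be sketched in a few lines of plain Python; the rule names and row fields are invented for illustration:

```python
def split_by_rules(rows, rules):
    """Partition rows into accepted/rejected lists of (row, failed_rule_names)."""
    accepted, rejected = [], []
    for row in rows:
        failures = [name for name, ok in rules if not ok(row)]
        (rejected if failures else accepted).append((row, failures))
    return accepted, rejected

# Each rule is a (name, predicate) pair; a row must pass all of them to load.
rules = [
    ("amount_positive", lambda r: r["amount"] > 0),
    ("currency_known", lambda r: r["currency"] in {"USD", "INR"}),
]
rows = [{"amount": 10, "currency": "USD"}, {"amount": -5, "currency": "EUR"}]
accepted, rejected = split_by_rules(rows, rules)
```

Keeping the failed rule names alongside each rejected row makes the load procedure auditable: reviewers can see why a data point was discarded rather than just that it was.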
Posted 3 months ago
3 - 6 years
6 - 10 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
• Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
3 - 6 years
6 - 10 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
• Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
3 - 6 years
6 - 10 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
• Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
2 - 6 years
12 - 16 Lacs
Bengaluru
Work from Office
Responsibilities
• Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive.
• Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning.
• Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems.
• Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions.
• Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting.
• Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows.
• Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance.
• Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle.
• Communicate effectively with both technical and non-technical stakeholders for handover, incident management reporting, etc.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
• Extensive experience with AWS services, including S3, EC2, and EMR.
• Strong expertise in data warehousing and SQL, with experience in performance optimization.
• Experience with ETL/ELT implementation (such as Talend).
• Proficiency in Linux, with a strong background in shell scripting.
Preferred technical and professional experience
• Familiarity with scheduling tools like Airflow or Control-M.
• Experience with metadata-driven frameworks.
• Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
• Excellent communication skills and a willing attitude towards learning.
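The "metadata-driven frameworks" mentioned in the listing above refer to pipelines whose behaviour comes from configuration rather than hard-coded logic. A minimal sketch in plain Python, with invented field names and config, just to show the shape:

```python
# The transform's behaviour is driven entirely by this metadata, not by code:
CONFIG = {
    "rename": {"cust_id": "customer_id", "amt": "amount"},
    "cast": {"amount": float},
}

def transform(record, config):
    """Apply config-driven renames, then config-driven type casts."""
    out = {config["rename"].get(k, k): v for k, v in record.items()}
    for field, caster in config["cast"].items():
        if field in out:
            out[field] = caster(out[field])
    return out

print(transform({"cust_id": "C1", "amt": "9.5"}, CONFIG))
```

Adding a new source then means adding a new config entry rather than a new code path, which is where the reusability and governance benefits come from.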
Posted 3 months ago
8 - 13 years
40 - 80 Lacs
Bengaluru
Work from Office
ABOUT THIS ROLE: In this dynamic environment, where numbers reign supreme, it's the delicate interplay of 1s and 0s that propels our data-driven pursuits. Software engineers at Nielsen embody the art of balancing precision with disruption, reliability with innovation. Nielsen, a tech juggernaut fueled by nearly a century of pioneering momentum, is set on revealing the future to the world, and none of this would be achievable without the collaborative efforts of our talented engineers.
RESPONSIBILITIES
• Build and optimize extensive data sets, big data pipelines, and architectures.
• Apply top-notch analytical skills to navigate unstructured datasets effectively.
• Develop processes supporting data transformation, workload management, data structures, dependencies, and metadata.
• Conduct root cause analyses on external and internal processes and data to uncover improvement opportunities and answer critical questions.
• Identify, design, and implement process enhancements, focusing on automation, usability, and scalability.
• Collaborate seamlessly with product and technology teams to design and validate the capabilities of the data platform.
• Establish and maintain high programming standards and practices across the ecosystem.
• Provide support and collaboration with cross-functional teams.
• Communicate effectively, presenting complex ideas in a clear and concise manner to diverse audiences.
QUALIFICATIONS
• 2+ years of hands-on experience with a diverse range of AWS technologies (e.g., S3, Lambda, Glue, Athena, IAM, SQS, CloudWatch, CloudFormation).
• Proficient use of tools such as Git, GitLab, and Jira.
• Demonstrated knowledge and proficiency (2+ years) in SQL and Python.
• Demonstrated knowledge and proficiency (1+ years) in technologies like Apache Superset, Tableau, Spark, Hive, Java, and Hadoop.
• Demonstrated expertise in building and optimizing data pipelines within a distributed environment.
• Bonus points for experience in building Business Intelligence (BI) tools.
• A master's degree, bachelor's degree, or equivalent work experience that showcases your prowess.
• Excellent communication skills, fluent English, and the ability to work with healthy overlap to US business hours are a must.
Join us in reshaping the landscape of data possibilities, where your skills will not only be valued but will play a crucial role in defining what lies ahead.
ABOUT NIELSEN We're in tune with what the world is watching, buying, and everything in between. If you can think of it, we're measuring it. We sift through the small stuff and piece together big pictures to provide a comprehensive understanding of what's happening now and what's coming next for our clients. Today's data is tomorrow's marketplace revelation. We like to be in the middle of the action. That's why you can find us at work in over 100 countries. From global industry leaders to small businesses, consumer goods to media companies, we work with them all. We're bringing in data 24/7 and the possibilities are endless. See what's next with us at Nielsen: careers.nielsen.com
Posted 3 months ago
4 - 9 years
19 - 23 Lacs
Pune
Work from Office
Job Summary
As a Solutions Architect, the candidate will be responsible for understanding requirements and building solution architectures for the Data Engineering and Advanced Analytics Capability. The role requires a mix of technical knowledge and finance-domain functional knowledge, though the functional knowledge is not necessarily a must-have. The candidate will apply best practices to create data architectures that are secure, scalable, cost-effective, efficient, reusable, and resilient. The candidate will participate in technical discussions, present their architectures to stakeholders for feedback, and incorporate their input. The candidate will evaluate, recommend, and integrate SaaS applications to meet business needs, and provide architectures for integrating existing Eaton applications or developing new ones with a cloud-first mindset. The candidate will offer design oversight and guidance during project execution, ensuring solutions align with strategic business and IT goals. As a hands-on technical leader, the candidate will also drive Snowflake architecture. The candidate will collaborate with both technical teams and business stakeholders, providing insights on best practices and guiding data-driven decision-making. This role demands expertise in Snowflake's advanced features and cloud platforms, along with a passion for mentoring junior engineers.
Job Responsibilities
• Collaborate with data engineers, system architects, and product owners to implement and support Eaton's data mesh strategy, ensuring scalability, supportability, and reusability of data products.
• Lead the design and development of data products and solutions that meet business needs and align with the overall data strategy, creating complex enterprise datasets adhering to technology and data protection standards.
• Deliver strategic infrastructure and data pipelines for optimal data extraction, transformation, and loading, documenting solutions with architecture diagrams, dataflows, code comments, data lineage, entity relationship diagrams, and metadata.
• Design, engineer, and orchestrate scalable, supportable, and reusable datasets, managing non-functional requirements, technical specifications, and compliance.
• Assess technical capabilities across Value Streams to select and align technical solutions following enterprise guardrails, executing proofs of concept (POCs) where applicable.
• Oversee enterprise solutions for various data technology patterns and platforms, collaborating with senior business stakeholders, functional analysts, and data scientists to deliver robust data solutions aligned with quality measures.
• Support continuous integration and continuous delivery, maintaining architectural runways for products within a Value Chain, and implement data governance frameworks and tools to ensure data quality, privacy, and compliance.
• Develop and support advanced data solutions and tools, leveraging advanced data visualization tools like Power BI to enhance data insights, and manage data sourcing and consumption integration patterns from Eaton's data platform, Snowflake.
• Be accountable for end-to-end delivery of source data acquisition, complex transformation and orchestration pipelines, and front-end visualization.
• Lead collaboration with business stakeholders, applying strong communication and presentation skills, to deliver rapid, incremental business value/outcomes.
• Lead and participate in the planning, definition, development, and high-level design of solutions and architectural alternatives.
• Participate in solution planning, incremental planning, product demos, and inspect-and-adapt events.
• Plan and develop the architectural runway for products that support desired business outcomes.
• Provide technical oversight and encourage security, quality, and automation.
• Support the team with a techno-functional approach as needed.
Qualifications:
• Education: BE in Computer Science, Electrical, Electronics, or any other equivalent degree.
• 10 years' experience with, or knowledge of, Snowflake, including administration/architecture.
• Expertise in complex SQL, Python scripting, and performance tuning.
• Understanding of Snowflake data engineering practices and dimensional modeling for performance and scalability.
• Experience with data security, access controls, and setting up security frameworks and governance (e.g., SOX).
Technical Skills
• Advanced SQL skills for building queries and resource monitors in Snowflake.
• Proficiency in automating Snowflake admin tasks and handling concepts like RBAC controls, virtual warehouses, resource monitors, SQL performance tuning, zero-copy clone, and time travel.
• Experience re-clustering data in Snowflake and understanding micro-partitions.
• Excellent analysis, documentation, communication, presentation, and interpersonal skills.
• Ability to work under pressure, meet deadlines, and manage, mentor, and coach a team of analysts.
• Strong analytical skills for complex problem-solving and understanding business problems.
• Experience in data engineering, data visualization, and creating interactive analytics solutions using Power BI and Python.
• Extensive experience with cloud platforms like Azure and cloud-based data storage and processing technologies.
• Expertise in dimensional and transactional data modeling using OLTP, OLAP, NoSQL, and Big Data technologies.
• Familiarity with data frameworks and storage platforms like Cloudera, Databricks, Dataiku, Snowflake, dbt, Coalesce, and data mesh.
• Experience developing and supporting data pipelines, including code, orchestration, quality, and observability.
• Expert-level programming ability in multiple data manipulation languages (Python, Spark, SQL, PL/SQL).
• Intermediate experience with DevOps and CI/CD principles and tools, including Azure Data Factory.
• Experience with data governance frameworks and tools to ensure data quality, privacy, and compliance.
• Solid understanding of cybersecurity concepts such as encryption, hashing, and certificates.
• Strong analytical skills to evaluate data, reconcile conflicts, and abstract information.
• Continual learning of new modules, ETL tools, and programming techniques, with awareness of new technologies relevant to the environment.
• Established as a key data leader at the enterprise level.
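The Snowflake concepts this listing names (zero-copy clone, time travel, resource monitors) each map to a single SQL statement. The sketch below merely assembles those statements in Python so their shapes are visible; the object names are hypothetical and nothing is executed against a live account:

```python
def clone_table(target, source):
    # Zero-copy clone: a metadata-only copy; no data is physically duplicated.
    return f"CREATE TABLE {target} CLONE {source};"

def time_travel_query(table, minutes_ago):
    # Time travel: query the table as it existed N minutes ago (OFFSET is in seconds).
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes_ago});"

def resource_monitor(name, credit_quota):
    # Resource monitor: suspend warehouses once the credit quota is fully consumed.
    return (f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
            f"TRIGGERS ON 100 PERCENT DO SUSPEND;")

print(clone_table("sales_dev", "sales"))
```

Cloning plus time travel is a common pattern for spinning up dev/test copies and recovering from bad loads without restoring backups, while resource monitors are the usual guardrail against runaway warehouse spend.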
Posted 3 months ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
The Appstore Quality team's mission is to automate all types of functional, non-functional, and compliance checks on 3P apps to enable the north-star vision of publishing apps in under 5 hours. The certification tech team uses various ML/AI techniques to automatically detect violations in images and text metadata submitted by developers. The Appstore certification team maintains a device farm consisting of hundreds of devices across multiple operating systems to automate functionality checks for 3P apps before they are published to the catalog. Appstore certification tech is working on an ambitious project to use AI to auto-navigate a mobile app and detect in-app issues and violations. This role involves working closely with our principal engineer to design, develop, and deploy AI-based auto-navigation of apps. You will be working on AI-based techniques involving computer vision, LLMs, and a deep understanding of Android and other OS platforms.
Key job responsibilities
Do you want to solve business challenges through innovative technology? Do you enjoy working on scalable services technology in a team environment? Do you like working on industry-defining projects that move the needle? At Amazon, we hire the best minds in technology to innovate and build on behalf of our customers. Our Software Development Engineers (SDEs) use technology to solve complex problems and get to see the impact of their work first-hand. The challenges SDEs solve for at Amazon are big and influence millions of customers, sellers, and products around the world. We are looking for individuals who are passionate about creating new products, features, and services from scratch while managing ambiguity and the pace of a company where development cycles are measured in weeks, not years. If this sounds interesting to you, apply and come chart your own path at Amazon.
About the team
In Appstore, we entertain and delight hundreds of millions of people across devices with a vast selection of relevant apps, games, and services by making it trivially easy for developers to deliver. The Appstore team enables the customer and developer flywheel on devices by enabling developers to seamlessly launch and manage their apps/in-app content on Amazon. It helps customers discover, buy, and engage with these apps on Fire TV, Fire Tablets, and mobile devices. The technologies we build on vary from device software to high-scale services to efficient tools for developers.
• 3+ years of non-internship professional software development experience
• 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
• Experience programming with at least one software programming language
Posted 3 months ago
3 - 5 years
10 - 14 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges, using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In this role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that combine a great user experience with high performance, security, quality, and stability
• Working with a variety of databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control with Git; knowledge of Infrastructure as Code (Terraform) preferred
• Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
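The posting asks for working knowledge of stream processing and windowed pipelines (Dataflow, Spark Streaming). As a minimal, hypothetical pure-Python sketch of the underlying idea, here is a tumbling-window aggregation; the function name and event data are illustrative, not part of any listed product's API:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed-size windows and count
    occurrences per key -- the core idea behind windowed stream
    aggregation in engines such as Dataflow or Spark Streaming.
    (Illustrative sketch only; real engines handle watermarks,
    late data, and distribution.)"""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size) * window_size  # floor to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

The same grouping, expressed declaratively, is what a `GROUP BY TUMBLE(...)`-style windowed query does in a streaming SQL engine.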
Posted 3 months ago
6 - 10 years
10 - 14 Lacs
Bengaluru
Work from Office
Responsibilities
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges, using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In this role, you may be responsible for:
• Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
• Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
• Employing IBM's Design Thinking to create products that combine a great user experience with high performance, security, quality, and stability
• Working with a variety of databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
• Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• SQL authoring, query, and cost optimisation, primarily on BigQuery
• Python as an object-oriented scripting language
• Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
• Version control with Git; knowledge of Infrastructure as Code (Terraform) preferred
• Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience
• Experience building and optimising data pipelines, architectures, and data sets
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Working knowledge of message queuing, stream processing, and highly scalable data stores
• Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 3 months ago
6 - 11 years
12 - 16 Lacs
Bengaluru
Work from Office
Responsibilities
• Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive
• Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning
• Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems
• Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions
• Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting
• Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows
• Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance
• Collaborate closely with DevOps teams using SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle
• Communicate effectively with both technical and non-technical stakeholders for handover, incident-management reporting, etc.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
• Demonstrated expertise in big data technologies, specifically Apache Spark (with a focus on Spark SQL) and Hive
• Extensive experience with AWS services, including S3, EC2, and EMR
• Strong expertise in data warehousing and SQL, with experience in performance optimization
• Experience with ETL/ELT implementation (such as Talend)
• Proficiency in Linux, with a strong background in shell scripting
Preferred technical and professional experience
• Familiarity with scheduling tools such as Airflow or Control-M
• Experience with metadata-driven frameworks
• Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence
• Excellent communication skills and a willing attitude toward learning
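The posting above calls for experience with metadata-driven frameworks, where transformations are described in configuration rather than hard-coded. A minimal, hypothetical Python sketch of that idea (the operation names and metadata schema here are invented for illustration, not from any specific framework):

```python
def run_pipeline(metadata, source_rows):
    """Apply a sequence of transforms described by a metadata dict,
    so new pipelines are configured rather than re-coded -- the reuse
    idea behind a metadata-driven ETL framework."""
    transforms = {
        # rename: map old column names to new ones per the config dict
        "rename": lambda row, cfg: {cfg.get(k, k): v for k, v in row.items()},
        # filter: keep only rows where a column equals a given value
        "filter": lambda row, cfg: row if row.get(cfg["column"]) == cfg["equals"] else None,
        # add_constant: append a fixed column/value to every row
        "add_constant": lambda row, cfg: {**row, cfg["column"]: cfg["value"]},
    }
    rows = source_rows
    for step in metadata["steps"]:
        fn, cfg = transforms[step["op"]], step.get("config", {})
        rows = [r for r in (fn(row, cfg) for row in rows) if r is not None]
    return rows

meta = {"steps": [
    {"op": "filter", "config": {"column": "status", "equals": "active"}},
    {"op": "rename", "config": {"id": "customer_id"}},
]}
rows = [{"id": 1, "status": "active"}, {"id": 2, "status": "closed"}]
print(run_pipeline(meta, rows))
# [{'customer_id': 1, 'status': 'active'}]
```

In practice the metadata would live in a catalog or config store, and each step would map to a Spark SQL or Talend job rather than an in-memory lambda.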
Posted 3 months ago