0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high scale applications across the full engineering stack
- Design, develop, test, deploy, maintain, and improve software
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
- Participate in a tight-knit, globally distributed engineering team
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality
- Manage sole project priorities, deadlines, and deliverables
- Research, create, and develop software applications to extend and improve on Equifax Solutions
- Collaborate on scalability issues involving access to data and information
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with Cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What Could Set You Apart
- Self-starter that identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
- Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and Github)
- Developing with modern JDK (v1.7+)
- Automated Testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
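For illustration only, here is a minimal sketch of the kind of Dataflow/Apache Beam batch pipeline the "What could set you apart" list mentions, written with the Beam Python SDK; the bucket paths and the assumed CSV layout are hypothetical, not Equifax systems.

```python
# Minimal Apache Beam batch pipeline sketch: read CSV lines from GCS,
# sum a numeric column, and write the result back out.
# Paths and the assumed "id,score" layout are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # DirectRunner executes locally; swap in DataflowRunner (plus project,
    # region, and temp_location options) to run on GCP Dataflow.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "ParseScore" >> beam.Map(lambda line: int(line.split(",")[1]))
            | "Sum" >> beam.CombineGlobally(sum)
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/total")
        )

if __name__ == "__main__":
    run()
```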
Posted 4 weeks ago
3 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: SRE Manager - TechBlocks India
Location: Hyderabad & Ahmedabad. Full-time. 3 days from office.

The SRE Manager at TechBlocks India will lead the reliability engineering function, ensuring infrastructure resiliency and optimal operational performance. This hybrid role blends technical leadership with team mentorship and cross-functional coordination.

Requirements:
- 10+ years total experience, with 3+ years in a leadership role in SRE or Cloud Operations
- Deep understanding of Kubernetes, GKE, Prometheus, Terraform
- Cloud: advanced GCP administration
- CI/CD: Jenkins, Argo CD, GitHub Actions
- Incident management: full lifecycle, tools like OpsGenie
- Knowledge of service mesh and observability stacks
- Strong scripting skills (Python, Bash)
- BigQuery/Dataflow exposure for telemetry

Responsibilities:
- Build and lead a team of SREs
- Standardize practices for reliability, alerting, and response
- Engage with Engineering and Product leaders
- Establish and lead the implementation of organizational reliability strategies, aligning SLAs, SLOs, and Error Budgets with business goals and customer expectations
- Develop and institutionalize incident response frameworks, including escalation policies, on-call scheduling, service ownership mapping, and RCA process governance
- Lead technical reviews for infrastructure reliability design, high-availability architectures, and resiliency patterns across distributed cloud services
- Champion observability and monitoring culture by standardizing tooling, alert definitions, dashboard templates, and telemetry data schemas across all product teams
- Drive continuous improvement through operational maturity assessments, toil elimination initiatives, and SRE OKRs aligned with product objectives
- Collaborate with cloud engineering and platform teams to introduce self-healing systems, capacity-aware autoscaling, and latency-optimized service mesh patterns
- Act as the principal escalation point for reliability-related concerns and ensure incident retrospectives lead to measurable improvements in uptime and MTTR
- Own runbook standardization, capacity planning, failure mode analysis, and production readiness reviews for new feature launches
- Mentor and develop a high-performing SRE team, fostering a proactive ownership culture, encouraging cross-functional knowledge sharing, and establishing technical career pathways
- Collaborate with leadership, delivery, and customer stakeholders to define reliability goals, track performance, and demonstrate ROI on SRE investments

About TechBlocks: TechBlocks is a global digital product engineering company with 16+ years of experience helping Fortune 500 enterprises and high-growth brands accelerate innovation, modernize technology, and drive digital transformation. From cloud solutions and data engineering to experience design and platform modernization, we help businesses solve complex challenges and unlock new growth opportunities. At TechBlocks, we believe technology is only as powerful as the people behind it. We foster a culture of collaboration, creativity, and continuous learning, where big ideas turn into real impact. Whether you're building seamless digital experiences, optimizing enterprise platforms, or tackling complex integrations, you'll be part of a dynamic, fast-moving team that values innovation and ownership. Join us and shape the future of digital transformation.
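As a concrete illustration of the SLO/error-budget alignment the role calls for, here is a minimal sketch of the underlying arithmetic; the 99.9% target, 30-day window, and downtime figure are assumed values, not TechBlocks policy.

```python
# Error-budget arithmetic for an availability SLO over a rolling window.
SLO = 0.999                                   # assumed availability target
window_minutes = 30 * 24 * 60                 # 30-day window, in minutes
error_budget = (1 - SLO) * window_minutes     # ~43.2 minutes of allowed downtime

observed_downtime = 12.5                      # assumed downtime so far (minutes)
budget_remaining = error_budget - observed_downtime
burn_rate = observed_downtime / error_budget  # fraction of budget consumed

print(f"budget {error_budget:.1f} min, remaining {budget_remaining:.1f} min, "
      f"burned {burn_rate:.0%}")
```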
Posted 4 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Position Title: Data Scientist II
Function/Group: R&D/Packaging
Location: Mumbai
Shift Timing: Regular
Role Reports to: Sr. Manager, Global Knowledge Solutions
Remote/Hybrid/In-Office: Hybrid

About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Haagen-Dazs, we have been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview
Function Overview: In partnership with our cross-functional partners, ITQ innovates and develops products that meet the ever-changing needs of our consumers and enables long-term business growth. We identify and develop technologies that shape and protect our businesses today and into the future. ITQ operates across three organizations: Global Applications, Capabilities COEs, and Shared Services & Operations. For more details about General Mills please visit this Link.

Purpose of the role
The Global Knowledge Services (GKS) organization catalyzes the creation, transfer, and application of knowledge to ensure ITQ succeeds at its mission of driving internal and external innovation, developing differentiated technology, and engendering trust through food safety and quality. The scientists in the Statistics and Analytics Program Area will collaborate with US and India GKS team members to deliver high-value statistical work that advances ITQ initiatives in consumer product research, health and nutrition science, research and development, and quality improvement. The Data Scientist II in this program area will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics, data science, and business intelligence across our organization, leveraging GCP services. This role requires close collaboration with statisticians, data scientists, and BI developers to ensure timely, reliable, and quality data delivery that drives insights and decision-making.

Key Accountabilities
70% of Time - Excellent Technical Work
- Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.)
- Build and maintain data architecture that supports structured and unstructured data from multiple sources
- Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics
- Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau
- Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy
- Collaborate with DevOps/Cloud teams to ensure data infrastructure is secure, scalable, and cost-effective
- Support and optimize workflows for data exploration, experimentation, and productization of models
- Participate in data governance efforts, including metadata management, data cataloging, and access controls

15% of Time - Client Consultation and Business Partnering
- Work effectively with clients to identify client needs and success criteria, and translate them into clear project objectives, timelines, and plans
- Be responsive and timely in sharing project updates, responding to client queries, and delivering on project commitments
- Clearly communicate analysis, conclusions, and insights to clients using written reports and real-time meetings

10% of Time - Innovation, Continuous Improvement (CI), and Personal Development
- Learn and apply a CI mindset to work, seeking opportunities for improvements in efficiency and client value
- Identify new resources, develop new methods, and seek external inspiration to drive innovations in our work processes
- Continually build skills and knowledge in the fields of statistics and the relevant sciences

5% of Time - Administration
- Participate in all required training (Safety, HR, Finance, CI, other) and actively participate in GKS and ITQ meetings, events, and activities
- Complete other administrative tasks as required

Minimum Qualifications
- Minimum Degree Requirements: Masters from an accredited university
- Minimum 6 years of related experience required

Specific Job Experience or Skills Needed
- 6+ years of experience in data engineering roles, including strong hands-on GCP experience
- Proficiency in GCP services like BigQuery, Cloud Storage, Cloud Composer (Airflow), Dataflow, Pub/Sub
- Strong SQL skills and experience working with large-scale data warehouses
- Solid programming skills in Python and/or Java/Scala
- Experience with data modeling, schema design, and performance tuning
- Familiarity with CI/CD, Git, and infrastructure-as-code principles (Terraform preferred)
- Strong communication and collaboration skills across cross-functional teams

For Global Knowledge Services
- Ability to effectively work cross-functionally with internal/global team members
- High self-motivation, with the ability to work both independently and in teams
- Excels at driving projects to completion, with attention to detail
- Ability to exercise judgment in handling confidential and proprietary information
- Ability to effectively prioritize, multi-task, and execute tasks according to a plan
- Able to work on multiple priorities and projects simultaneously
- Demonstrated creative problem-solving abilities, attention to detail, ability to “think outside the box”

Preferred Qualifications
- Preferred Major Area of Study: Master’s degree in Computer Science, Engineering, Data Science, or a related field
- Preferred Professional Certifications: GCP
- Preferred 6 years of related experience

Company Overview
We exist to make food the world loves. But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best — bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.
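As a small illustration of the provisioning work described above (pulling a clean, aggregated dataset out of BigQuery for statisticians to model against), here is a hedged sketch using the google-cloud-bigquery client; the project, dataset, and column names are placeholders.

```python
# Sketch: run an aggregating query in BigQuery and hand the result to
# analysts as a pandas DataFrame. Requires google-cloud-bigquery plus
# pandas and db-dtypes; all object names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
query = """
    SELECT batch_id, DATE(measured_at) AS day, AVG(moisture_pct) AS avg_moisture
    FROM `example-project.quality.measurements`
    GROUP BY batch_id, day
"""
df = client.query(query).to_dataframe()
print(df.head())
```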
Posted 4 weeks ago
0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
- Design and implement scalable, efficient, and secure data pipelines on GCP, utilizing tools such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage
- Collaborate with cross-functional teams (data scientists, analysts, and software engineers) to understand business requirements and deliver actionable data solutions
- Develop and maintain ETL/ELT processes to ingest, transform, and load data from various sources into GCP-based data warehouses
- Build and manage data lakes and data marts on GCP to support analytics and business intelligence initiatives
- Implement automated data quality checks, monitoring, and alerting systems to ensure data integrity
- Optimize and tune performance for large-scale data processing jobs in BigQuery, Dataflow, and other GCP tools
- Create and maintain data pipelines to collect, clean, and transform data for analytics and machine learning purposes
- Ensure data governance and compliance with organizational policies, including data security, privacy, and access controls
- Stay up to date with new GCP services and features and make recommendations for improvements and new implementations

Mandatory Skill Sets: GCP, BigQuery, Dataproc
Preferred Skill Sets: GCP, BigQuery, Dataproc, Airflow
Years of Experience Required: 4-7
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified): Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 18 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
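To make the automated data-quality checks mentioned above concrete, here is a minimal, self-contained sketch; the column names and rules are invented examples, not PwC standards.

```python
# Sketch of a batch-level data-quality gate: collect human-readable
# violations before rows are loaded into a warehouse table.
def check_rows(rows):
    violations = []
    for i, row in enumerate(rows):
        if row.get("order_id") is None:
            violations.append(f"record {i}: missing order_id")
        amount = row.get("amount")
        if amount is not None and amount < 0:
            violations.append(f"record {i}: negative amount {amount}")
    return violations

batch = [{"order_id": 1, "amount": 10.5}, {"order_id": None, "amount": -2.0}]
print(check_rows(batch))  # two violations, both from the second record
```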
Posted 4 weeks ago
1 years
0 Lacs
India
Remote
Trainee Magento Developer
Location: Remote
Job Type: Full-Time
Experience Level: 1+ years

About Us
Zeproc is a fast-growing e-commerce marketplace specializing in industrial supplies, empowering suppliers and businesses across Europe. We are passionate about solving operational inefficiencies and offering unparalleled value to our customers. We are building our technology team and looking for highly motivated people to join us on this journey. Be part of a team that is passionate about innovation, customer-centric design, and driving the future of e-commerce.

About the Role
We are looking for a Trainee Magento 2 Developer with at least 1 year of experience who is eager to grow in the field of eCommerce development. In addition to development responsibilities, this role involves active coordination with the Operations team for tasks such as catalog import, MIS reporting, and data validation. This is a great opportunity for someone looking to build both technical and cross-functional skills in a fast-paced, collaborative environment.

Key Responsibilities
Magento Development:
- Assist in the development and maintenance of Magento 2 websites
- Perform basic customization and configuration of themes and modules
- Support the implementation of new features and functionality as per project requirements
- Debug and resolve issues under the guidance of senior developers

Operations Coordination:
- Work closely with the Operations team to assist with catalog imports, product data uploads, and updates using the Magento admin and custom scripts
- Perform regular QC of the application - both unit testing and customer-experience UI
- Generate and share MIS reports as required by business teams (sales, inventory, catalog, etc.)
- Perform regular data quality checks and ensure accuracy in product listings
- Support documentation and process improvement for catalog and operational workflows

Required Skills & Qualifications
- Minimum 1 year of hands-on experience with Magento 2 (Community or Enterprise)
- Basic knowledge of PHP, MySQL, HTML/CSS, and JavaScript
- Familiarity with the Magento 2 admin panel, product configuration, and attribute management
- Understanding of eCommerce workflows such as product lifecycle, inventory management, Magento EAV, etc.
- Strong Excel/Google Sheets skills for handling bulk data and generating reports
- Ability to work collaboratively with technical and non-technical teams
- Good written and verbal communication skills

Nice to Have
- Exposure to Magento data import/export tools (CSV, DataFlow, or custom scripts)
- Experience with basic API usage or integrations
- Familiarity with tools like Git, JIRA, or Trello
- Experience working in a team following Agile or Scrum practices

What We Offer
- Competitive compensation: attractive salary with performance-based bonuses
- Growth environment: work with a team of talented professionals in a collaborative culture that fosters innovation
- Work-life balance: we provide full-time work from home
- Impactful projects: contribute to industry-leading e-commerce platforms and high-impact projects

Why Join Us?
- Be part of a mission-driven company making strides in the e-commerce landscape
- Work with cutting-edge tools and technologies to deliver best-in-class solutions
- Collaborate with industry veterans and like-minded innovators
- Enjoy a clear career progression path and recognition for your contributions

Application Process
To apply, please share your CV and a brief cover letter detailing your relevant experience and certifications, including links to your portfolio, GitHub profile, or recent Magento projects. Applications will be reviewed on a rolling basis; early applications are encouraged. Please note: the candidate will go through a practical problem-solving exercise and a technical test during the interview process.
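As an illustration of the catalog-import QC duties described above, here is a small pre-import validation sketch for a product CSV; the required columns and rules are assumptions, not Zeproc's actual import format.

```python
# Sketch: validate a Magento-style catalog CSV before bulk import.
import csv

REQUIRED = ["sku", "name", "price", "qty"]  # assumed column set

def validate_catalog(path):
    errors = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
        if missing:
            return [f"missing columns: {missing}"]
        for line_no, row in enumerate(reader, start=2):  # header is line 1
            if not row["sku"].strip():
                errors.append(f"line {line_no}: empty sku")
            try:
                if float(row["price"]) <= 0:
                    errors.append(f"line {line_no}: non-positive price")
            except ValueError:
                errors.append(f"line {line_no}: unparseable price {row['price']!r}")
    return errors

print(validate_catalog("catalog.csv"))  # [] when the file is clean
```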
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a skilled Senior Data Engineer to become a part of our dynamic team. In this role, you will focus on projects involving data integration and ETL processes tailored for cloud-based environments. Your main tasks will include crafting and executing sophisticated data structures while ensuring the integrity, accuracy, and accessibility of data.

Responsibilities
- Design and execute sophisticated data structures for cloud environments
- Develop ETL workflows utilizing SQL, Python, and other pertinent technologies
- Maintain data integrity, reliability, and accessibility for all relevant parties
- Work with diverse teams to comprehend data integration needs and specifications
- Create and manage documentation such as technical details, data flow charts, and data mappings
- Enhance and monitor data integration workflows to boost performance and efficiency while maintaining data accuracy and integrity

Requirements
- Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
- 5-8 years of experience in data engineering
- Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong understanding of SQL for data querying and manipulation
- Familiarity with Snowflake for data warehousing
- Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Excellent problem-solving skills and attention to detail
- Good verbal and written communication skills in English at a B2 level

Nice to have
- Background in ETL using Python
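For illustration, here is a minimal extract-transform-load sketch in the SQL-plus-Python style the posting describes; sqlite3 stands in for a cloud warehouse (Snowflake, BigQuery) purely so the example is runnable, and all data is invented.

```python
# Tiny ETL sketch: extract raw rows, cast types in the transform step,
# load into a SQL table, and verify with an aggregate query.
import sqlite3

def extract():
    # Stand-in for pulling rows from a source system or file drop.
    return [("2024-01-01", "EUR", "12.50"), ("2024-01-02", "EUR", "7.25")]

def transform(rows):
    # Cast string amounts to floats; real pipelines would also validate.
    return [(day, ccy, float(amount)) for day, ccy, amount in rows]

def load(rows):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (day TEXT, currency TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    (total,) = con.execute("SELECT SUM(amount) FROM sales").fetchone()
    return total

print(load(transform(extract())))  # 19.75
```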
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Greetings! One of our esteemed clients is a Japanese multinational information technology (IT) service and consulting company headquartered in Tokyo, Japan. The company acquired Italy-based Value Team S.p.A. and launched Global One Teams. Join this dynamic, high-impact firm where innovation meets opportunity — and take your career to new heights!

🔍 We Are Hiring: Lead Full Stack Java Developer

Experience: 8 to 12 years

Position Functions and Responsibilities
- Java 11 Spring Boot development and support; skill using Java 11 and above
- Java developer with AMQ/MQTT/OpenShift experience
- Skill in REST API-based web application development on Red Hat
- Additional skills required include microservices on OpenShift, GKE, Cloud Endpoints
- Skill in using queues (AMQ, MQTT)
- Support applications for BL, DL, integration, and services using Java
- Development of all CRUD dataflow and business logic
- Provide deployment support and documentation
- Should possess overall knowledge of the application and its functionality
- Fosters open communication within and between teams
- Support minor design and fixes of the applications, working with front-end and back-end teams
- Provide technical guidance to the team and lead on issue resolution

Interested candidates, please share your updated resume along with the following details:
- Total Experience:
- Relevant Experience in Java Full Stack Development:
- Current Location:
- Current CTC:
- Expected CTC:
- Notice Period:

🔒 We assure you that your profile will be handled with strict confidentiality.
📩 Apply now and be part of this incredible journey!

Thanks,
Syed Mohammad
syed.m@anlage.co.in
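Since the role centers on queue-based messaging (AMQ/MQTT), here is a minimal MQTT subscriber sketch; it uses Python's paho-mqtt (2.x) rather than the Java stack the posting names, and the broker host and topic are placeholders.

```python
# Sketch: subscribe to a topic and print incoming messages.
# Requires paho-mqtt >= 2.0; broker and topic are hypothetical.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, reason_code, properties):
    # (Re)subscribe on every successful connection.
    client.subscribe("orders/created")

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.loop_forever()
```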
Posted 4 weeks ago
0 years
0 Lacs
Delhi, India
Remote
About Apply Digital
Apply Digital is a global digital transformation partner for change agents. Leveraging expertise that spans Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, Marketing Services, Change Management, and beyond, we enable our clients to modernize their organizations and deliver meaningful impact to their business and customers. Our 750+ team members have helped transform global companies like Kraft Heinz, NFL, Moderna, Lululemon, Dropbox, Atlassian, A+E Networks, and The Very Group. Apply Digital was founded in 2016 in Vancouver, Canada. In the past nine years, we have grown to nine cities across North America, South America, the UK, and Europe.

At Apply Digital, we believe in the “One Team” approach, where we operate within a ‘pod’ structure. Each pod brings together senior leadership, subject matter experts, and cross-functional skill sets, all working within a common tech and delivery framework. This structure is underpinned by well-oiled scrum and sprint cadences, keeping teams in step to release often, and retrospectives to ensure we progress toward the desired outcomes. Wherever we work in the world, we envision Apply Digital as a safe, empowered, respectful and fun community for people, every single day. Together, we work to embody our SHAPE (smart, humble, active, positive, and excellent) values and make Apply Digital a space for our team to connect, grow, and support each other to make a difference. Visit our Careers page to learn how we can unlock your potential.

LOCATION: Apply Digital is a hybrid-friendly organization with remote options available if needed. The preferred candidate should be based in (or within a location commutable to) the Delhi/NCR region of India, working in hours that overlap with the Eastern Standard Time zone (EST).

About The Client
In your initial role, you will support Kraft Heinz, a global, multi-billion-dollar leader in consumer packaged foods and a valued client of ours for the past three years. Apply Digital has a bold and comprehensive mandate to drive Kraft Heinz’s digital transformation. Through implementable strategies, cutting-edge technology, and data-driven innovation, we aim to enhance consumer engagement and maximize business value for Kraft Heinz. Our composable architecture, modern engineering practices, and deep expertise in AI, cloud computing, and customer data solutions have enabled game-changing digital experiences. Our cross-functional team has delivered significant milestones, including the launch of the What's Cooking App, the re-building of 120+ brand sites in over 20 languages, and most recently, the implementation of a robust Customer Data Platform (CDP) designed to drive media effectiveness. Our work has also been recognized internationally and has received multiple awards. While your work will start with supporting Kraft Heinz, you will also have future opportunities to collaborate with the global team on other international brands.

THE ROLE: Are you passionate about designing and optimizing data pipelines to support business intelligence and operational data needs? Do you enjoy working with cloud-based data architectures and collaborating with cross-functional teams to deliver scalable solutions? If so, you may be the right fit for our Data Engineer role. As a Data Engineer at Apply Digital, you will contribute to building and maintaining cloud-native data pipelines that support our composable digital platforms. You will work with data engineers, backend developers, and product teams to develop efficient, scalable, and secure data solutions. This role requires a strong foundation in SQL, Python, and cloud data platforms (preferably Google Cloud Platform - BigQuery), along with experience in data pipeline orchestration tools (Dagster and DBT). It also requires strong English language proficiency and experience working with remote teams across North America and Latin America, with clear communication and coordination across distributed teams, including our clients.

WHAT YOU'LL DO:
- Design, develop, and maintain data pipelines for ETL/ELT/streaming workflows
- Collaborate with backend and platform engineers to integrate data solutions into cloud-native applications
- Optimize data storage, retrieval, and processing for performance and cost efficiency
- Operate cloud data infrastructure, primarily Google Cloud Platform (BigQuery, Cloud Storage, Pub/Sub)
- Work with analytics and product teams to define data models for reporting and business intelligence
- Implement data security, privacy, and governance best practices
- Monitor, troubleshoot, and enhance data pipeline reliability and performance
- Maintain clear documentation for data pipelines, transformations, and data sources
- Stay updated with best practices and emerging technologies in data engineering

WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication) is required
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones
- 3+ years of experience in data engineering, focusing on building scalable pipelines and cloud-native architectures
- Strong SQL skills for data modeling, transformation, and optimization
- Proficiency in Python for data processing and automation
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP)
- Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub
- Familiarity with ETL/ELT tools such as DBT, Apache Beam, or Google Dataflow
- Exposure to data pipeline orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows
- Knowledge of data privacy, security, and compliance practices
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities

NICE TO HAVES:
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis)
- Familiarity with machine learning workflows and MLOps best practices
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio)
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments
- Familiarity with data integrations involving Contentful, Algolia, Segment, and Talon.One

LIFE AT APPLY DIGITAL
At Apply Digital, people are at the core of everything we do. We value your time, safety, and health, and strive to build a work community that can help you thrive and grow. Here are a few benefits we offer to support you:
- Location: Apply Digital is a hybrid-friendly organization with remote options available if needed. The preferred candidate should be based in (or within a location commutable to) Delhi/NCR, with the ability to overlap with US/NA time zones when required.
- Comprehensive benefits: benefit from private healthcare coverage, contributions to your Provident Fund, and a gratuity bonus after five years of service.
- Vacation policy: work-life balance is key to our team’s success, so we offer flexible paid time off (PTO), allowing ample time away from work to promote overall well-being.
- Great projects: broaden your skills on a range of engaging projects with international brands that have a global impact.
- An inclusive and safe environment: we’re truly committed to building a culture where you are celebrated and everyone feels welcome and safe.
- Learning opportunities: we offer generous training budgets, including partner tech certifications, custom learning plans, workshops, mentorship, and peer support.

Apply Digital is committed to building a culture where differences are celebrated, and everyone feels welcome. That’s why we value equal opportunity and nurture an inclusive workplace where our individual differences are recognized and valued. For more information, visit our website’s Diversity, Equity, and Inclusion (DEI) page. If you have special needs or accommodations at this stage of the recruitment process, please inform us as soon as possible by emailing us at careers@applydigital.com.
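As a small illustration of the Dagster-orchestrated pipelines the role mentions, here is a hedged two-asset sketch; the asset names and stubbed data are invented, and a real pipeline would read from BigQuery rather than return literals.

```python
# Sketch: two dependent Dagster assets. Dagster wires order_totals to
# daily_orders via the parameter name.
import dagster as dg

@dg.asset
def daily_orders() -> list[dict]:
    # Stub standing in for a BigQuery extract.
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]

@dg.asset
def order_totals(daily_orders: list[dict]) -> float:
    return sum(row["amount"] for row in daily_orders)

defs = dg.Definitions(assets=[daily_orders, order_totals])
```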
Posted 1 month ago
5 - 10 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a highly skilled and motivated Lead DS/ML Engineer to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. We are looking for a Data Scientist / ML Engineer with a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning to develop and deploy sophisticated models. The role focuses on building scalable data pipelines, developing ML models, and deploying solutions in production to support a cutting-edge reporting, insights, and recommendations platform for measuring and optimizing online marketing campaigns. The ideal candidate should be comfortable working across data engineering, the ML model lifecycle, and cloud-native technologies.

Job Description

Key Responsibilities:

Data Engineering & Pipeline Development
- Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data
- Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect
- Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads
- Implement data modeling and feature engineering for ML use cases

Machine Learning Model Development & Validation
- Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization
- Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations
- Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis
- Optimize models for scalability, efficiency, and interpretability

MLOps & Model Deployment
- Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving
- Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining
- Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms

Cloud & Infrastructure Optimization
- Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow
- Work on containerized deployment (Docker, Kubernetes) for scalable model inference
- Implement cost-efficient, serverless data solutions where applicable

Business Impact & Cross-functional Collaboration
- Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives
- Translate complex model insights into actionable business recommendations
- Present findings and performance metrics to both technical and non-technical stakeholders

Qualifications & Skills:

Educational Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field
- Certification in Google Cloud (Professional Data Engineer, ML Engineer) is a plus

Must-Have Skills
- Experience: 5-10 years with the mentioned skill set and relevant hands-on experience
- Data engineering: experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer)
- ML model development: strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP
- Programming: proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing
- Cloud & infrastructure: expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms
- MLOps & deployment: hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools)
- Data warehousing & real-time processing: strong knowledge of modern data platforms for batch and streaming data processing

Nice-to-Have Skills
- Experience with Graph ML, reinforcement learning, or causal inference modeling
- Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards
- Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies
- Experience with distributed computing frameworks (Spark, Dask, Ray)

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
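To ground the model-development side of the role, here is a minimal supervised-learning validation sketch with scikit-learn; the synthetic dataset stands in for campaign-response data, so treat it as the shape of the workflow rather than a real model.

```python
# Sketch: cross-validate a classifier, reporting AUC, as one might when
# prototyping a campaign-response model. Data here is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = GradientBoostingClassifier(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```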
Posted 1 month ago
0 - 5 years
30 - 45 Lacs
Bengaluru, Karnataka
Work from Office
Responsibilities
- Solution Design
- Data Integration & Transformation
- Report & Dashboard Development
- Performance Optimization

Qualifications and Skills
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Proven experience of 4 to 5 years as a Power BI Architect or similar role, with a focus on designing and implementing Power BI solutions
- 8+ years of experience in Business Intelligence
- Good knowledge of and prior experience with Power BI services, including OLS and RLS, dataflows and datamarts, deployment pipelines, and gateways
- Strong proficiency in Power BI, including data modeling, DAX, Power Query, and report/dashboard development
- Solid understanding of data integration and ETL processes
- Familiarity with data warehouse concepts and multidimensional databases
- Experience with SQL and proficiency in writing complex queries
- Strong analytical and problem-solving skills

Job Type: Full-time
Pay: ₹3,000,000.00 - ₹4,500,000.00 per year
Work Location: In person
Posted 1 month ago
8 - 12 years
25 - 40 Lacs
Hyderabad
Remote
Senior GCP Cloud Administrator

Experience: 8-12 years
Salary: Competitive
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients)

Must-have skills: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification
Good-to-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS

Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
- Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
- Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc.
- Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
- Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
- Work with development teams to design the GCP-specific cloud architecture
- Provision and de-provision GCP accounts and resources for internal projects
- Manage and operate multiple GCP subscriptions
- Keep technical documentation up to date
- Proactively stay up to date on GCP announcements, services, and developments

Requirements:
- 5+ years of work experience provisioning, operating, and maintaining systems in GCP
- A valid certification as either a GCP Associate Cloud Engineer or GCP Professional Cloud Architect
- Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
- Ability to provide support and guidance on GCP operations and services depending on enterprise needs
- Working knowledge of Docker containers and Kubernetes
- Strong communication skills and the ability to work both independently and in a collaborative environment
- Fast learner and achiever who sets high personal goals
- Able to work on multiple projects and consistently meet project deadlines
- Willing to work on a shift basis based on project requirements

Good to Have:
- Experience in Terraform automation over GCP infrastructure provisioning
- Experience in Cloud Composer, Dataproc, Dataflow, storage, and monitoring services
- Experience in building and supporting any form of data pipeline
- Multi-cloud experience with AWS
- New Relic monitoring

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Paid paternity and maternity leave

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal
2. Upload your updated resume and complete the screening form
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
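As an illustration of the FinOps-style BigQuery cost control the responsibilities mention, here is a hedged dry-run sketch; the project and table are placeholders, and the quoted on-demand rate is an assumption to check against current pricing.

```python
# Sketch: dry-run a query to estimate bytes scanned (and rough cost)
# before allowing it to execute. Requires google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT * FROM `example-project.analytics.events`", job_config=config
)
gib = job.total_bytes_processed / 2**30
tib = gib / 1024
print(f"would scan {gib:.2f} GiB (~${tib * 6.25:.4f} at an assumed $6.25/TiB)")
```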
Posted 1 month ago
5 - 7 years
27 - 30 Lacs
Pune
Work from Office
Must-Have Skills:
- 5+ years of experience as a Big Data Engineer
- 3+ years of experience with Apache Spark, Hive, HDFS, and Beam (optional)
- Strong proficiency in SQL and either Scala or Python
- Experience with ETL processes and working with structured and unstructured data
- 2+ years of experience with Cloud Platforms (GCP, AWS, or Azure)
- Hands-on experience with software build management tools like Maven or Gradle
- Experience in automation, performance tuning, and optimizing data pipelines
- Familiarity with CI/CD, serverless computing, and infrastructure-as-code practices

Good-to-Have Skills:
- Experience with Google Cloud Services (BigQuery, Dataproc, Dataflow, Composer, DataStream)
- Strong knowledge of data pipeline development and optimization
- Familiarity with source control tools (SVN/Git, GitHub)
- Experience working in Agile environments (Scrum, XP, etc.)
- Knowledge of relational databases (SQL Server, Oracle, MySQL)
- Experience with Atlassian tools (JIRA, Confluence, GitHub)

Key Responsibilities:
- Extract, transform, and load (ETL) data from multiple sources using Big Data technologies
- Develop, enhance, and support data ingestion jobs using GCP services like Apache Spark, Dataproc, Dataflow, BigQuery, and Airflow
- Work closely with senior engineers and cross-functional teams to improve data accessibility
- Automate manual processes, optimize data pipelines, and enhance infrastructure for scalability
- Modify data extraction pipelines to follow standardized, reusable approaches
- Optimize query performance and data access techniques in collaboration with senior engineers
- Follow modern software development practices, including microservices, CI/CD, and infrastructure-as-code
- Participate in Agile development teams, ensuring best practices for software engineering and data management

Preferred Qualifications:
- Bachelor's degree in Computer Science, Systems Engineering, or a related field
- Self-starter with strong problem-solving skills and adaptability to shifting priorities
- Cloud certifications (GCP, AWS, or Azure) are a plus

Skills: GCP Services, Dataproc, Dataflow
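For illustration, here is a minimal PySpark ETL sketch of the kind the responsibilities describe; the input path and column names are placeholders.

```python
# Sketch: read CSVs, cast a column, aggregate per customer, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

df = spark.read.option("header", True).csv("input/*.csv")  # assumed layout
totals = (
    df.withColumn("amount", F.col("amount").cast("double"))
      .groupBy("customer_id")
      .agg(F.sum("amount").alias("total_amount"))
)
totals.write.mode("overwrite").parquet("output/customer_totals")
spark.stop()
```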
Posted 1 month ago
7 - 9 years
25 - 27 Lacs
Pune
Work from Office
We are seeking a highly experienced Senior Java & GCP Engineer to lead the design, development, and deployment of innovative batch and data processing solutions. This role requires strong technical expertise, leadership abilities, and hands-on experience with Java and Google Cloud Platform (GCP). The ideal candidate will collaborate with cross-functional teams, mentor developers, and ensure the delivery of high-quality, scalable solutions.

Key Responsibilities:
- Lead the design, development, and deployment of batch and data processing solutions
- Provide technical direction for Java and GCP-based implementations
- Mentor and guide a team of developers and engineers
- Work with cross-functional teams to translate business requirements into technical solutions
- Implement robust testing strategies and optimize performance
- Maintain technical documentation and ensure compliance with industry standards

Required Skills & Experience:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Expertise in Java and its ecosystems
- Extensive experience with GCP (Google Kubernetes Engine, Cloud Storage, Dataflow, BigQuery)
- 7+ years of experience in software development, with a focus on batch processing and data-driven applications
- Strong knowledge of secure data handling (PII/PHI)
- Proven ability to write clean, defect-free code
- 3+ years of leadership experience, mentoring and guiding teams
- Excellent communication and teamwork skills

This is a fantastic opportunity for a technical leader who is passionate about scalable cloud-based data solutions and eager to drive innovation in a collaborative environment.

Skills: Java, Microservices, GCP
Posted 1 month ago
5 years
0 Lacs
Trivandrum, Kerala, India
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high scale applications across the full engineering stack
- Design, develop, test, deploy, maintain, and improve software
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
- Participate in a tight-knit, globally distributed engineering team
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality
- Manage sole project priorities, deadlines, and deliverables
- Research, create, and develop software applications to extend and improve on Equifax Solutions
- Collaborate on scalability issues involving access to data and information
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with Cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What Could Set You Apart
- Self-starter that identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
- Source code control management systems (e.g. SVN/Git, Github) and build tools
- Agile environments (e.g. Scrum, XP)
- Relational databases
- Atlassian tooling (e.g. JIRA, Confluence, and Github)
- Developing with modern JDK (v1.7+)
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
Job Description
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Job Description - Grade Specific
The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 1 month ago
5 - 8 years
0 Lacs
Pune, Maharashtra, India
On-site
Required Skills:
- Experience of working with Serverless Computing, GCE, GKE, GAE and Google Cloud Functions
- Good understanding of microservices-based architecture
- Good working knowledge of Apigee, GCP Cloud Storage (GCS), Cloud BigQuery, Cloud Dataflow, Dataproc, Cloud BigTable, Anthos, Auto ML, etc.
- Cloud networks, VPC network(s) and access, load balancing, backup & recovery
- Understanding of the API management landscape; strong knowledge of Apigee component architecture and implementation configurations
- Experience in working with tools such as Ansible/Terraform (infrastructure as code), Jenkins (DevOps), Jenkins for CI, Git for version control
- Hands-on experience with related/complementary open source software platforms and languages (e.g. C++, Java, Python etc.)
Posted 1 month ago
5 years
0 Lacs
Pune, Maharashtra, India
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
Design, develop, and operate high scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Manage sole project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax Solutions
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activity

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others (a minimal Beam pipeline sketch follows this listing)
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and Github)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
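The "set you apart" items above name Dataflow/Apache Beam alongside Pub/Sub and BigQuery. A minimal Beam pipeline in Java along those lines might look like the sketch below; the project, subscription, and table names are hypothetical, and the payload-to-row mapping is deliberately simplistic (a real pipeline would parse JSON and validate the schema):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

import com.google.api.services.bigquery.model.TableRow;

public class PubSubToBigQuerySketch {
    public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Read raw message payloads from a (hypothetical) subscription.
            .apply("ReadMessages", PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/events-sub"))
            // Wrap each payload in a TableRow; real pipelines would parse JSON here.
            .apply("ToTableRow", MapElements
                .into(TypeDescriptor.of(TableRow.class))
                .via(payload -> new TableRow().set("payload", payload)))
            // Append rows to an existing (hypothetical) BigQuery table.
            .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
                .to("my-project:analytics.events")
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        pipeline.run();
    }
}
```

Run with the DataflowRunner (plus the usual --project and --region flags), the same code executes as a managed streaming job, which is the Dataflow angle these listings allude to.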
Posted 1 month ago
2 - 4 years
5 - 8 Lacs
Pune
Work from Office
We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.

Responsibilities:
Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
Design and Architecture: Participate in design reviews with peers and stakeholders.
Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide (a small JUnit sketch follows this listing).
Debugging and Troubleshooting: Triage defects or customer-reported issues, then debug and resolve them in a timely and efficient manner.
Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
DevOps Model: Understanding of working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production.
Documentation: Properly document new features, enhancements or fixes to the product, and also contribute to training materials.

Basic Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
2+ years of professional software development experience.
Proficiency as a developer using Python, FastAPI, PyTest, Celery and other Python frameworks.
Experience with software development practices and design patterns.
Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
Basic understanding of cloud technologies and DevOps principles.
Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.

Preferred Qualifications:
Experience with object-oriented programming, concurrency, design patterns, and REST APIs.
Experience with CI/CD tooling such as Terraform and GitHub Actions.
High-level familiarity with AI/ML, GenAI, and MLOps concepts.
Familiarity with frameworks like LangChain and LangGraph.
Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc.
Experience with GCP technologies such as VertexAI, BigQuery, GKE, GCS, DataFlow, and Kubeflow.
Experience with Docker and Kubernetes.
Experience with Java and Scala a plus.
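The Testing responsibility above leans on the test automation pyramid, whose wide base is fast unit tests. As a rough illustration in JUnit 5 (one of the tools the listing names), here is a self-contained test class; the discount function under test is hypothetical and inlined so the sketch compiles on its own:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class DiscountTest {

    // Hypothetical unit under test, defined inline to keep the sketch self-contained.
    static int applyDiscount(int amountInPaise, int percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent out of range: " + percent);
        }
        return amountInPaise - (amountInPaise * percent) / 100;
    }

    @Test
    void appliesTenPercentDiscount() {
        // 10% off 1000 paise leaves 900.
        assertEquals(900, applyDiscount(1000, 10));
    }

    @Test
    void rejectsOutOfRangePercent() {
        assertThrows(IllegalArgumentException.class, () -> applyDiscount(1000, 150));
    }
}
```

Tests like these sit at the base of the pyramid; slower, more brittle Selenium-style UI tests belong at the narrow top.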
Posted 1 month ago
2 - 5 years
0 Lacs
Hyderabad, Telangana, India
On-site
Title: Java Software Engineer – II
Location: Hyderabad (5 days per week)

Passionate about technology and see the world a little differently than your peers? Everywhere you look, there’s possibility. Opportunity. Boundaries to push and challenges to solve. You believe software engineering changes how people live.

At NCR VOYIX, we believe that, too. We’re one of the world’s first tech companies, and still going strong. Like us, you know the online and mobile worlds better than any other—and see patterns that no one else sees. Our software engineers write code that can survive under the pressure of hundreds of thousands of requests per minute.

We are looking for talented engineers to join our expanding platform as a service team. Our platform as a service is responsible for providing the foundation for NCR VOYIX cloud-based products, and includes a variety of features and services similar to those found on Google Cloud Platform and Amazon AWS.

We work with some of the smartest, nicest people you'll meet. People who work here say the problems they work on are enormously challenging, and that the team culture is the most supportive they have seen. Curious? Read on. We’re looking for software engineering talent like you.

IN THIS ROLE, YOU CAN EXPECT TO
Play a key role as a SW developer on newly formed scrum teams focused on developing a Cloud Platform that will serve each of our major industries – Retail, Restaurants and Payments. You will build and expand the services powering our API ecosystem, solving problems for a large community of fellow developers. Your key day-to-day responsibilities will include:
Crafting clean, well-tested code using rigorous continuous delivery methodologies, including automated functional and non-functional testing.
Participation in an enterprise open source community by producing quality project and API documentation, samples, and answering forum questions.
Building large-scale applications using Java or similar languages, with a focus on high performance, scalability and resiliency in a service-oriented environment.

Must Have:
3-5 years of Java software development experience
Excellent development skills with Java or another JVM language
Experience using relational and/or non-relational databases
Experience designing, implementing, and testing RESTful APIs
Familiarity with modern frameworks for building high-throughput, resilient microservices
Understanding of methodologies such as TDD and BDD, and some experience with tools and frameworks for automated testing
Some familiarity with distributed design patterns, high-volume data stores, and horizontal scaling techniques
Desire and ability to tackle problems both at the large scale (think hundred-node clusters) and the small scale (think individual atomic locks; a small compare-and-set sketch follows this listing)

Even better if you have:
Ability to learn existing and new applications and become familiar with them in a short amount of time to be able to "stand on your own"
Background working on highly available, high-transaction-volume, fault-tolerant systems
Familiarity with Spring Framework, Spring Cloud, node.js
Participated in public open source projects
Prior experience using, or knowledge of, Google Cloud DataFlow, Google Cloud Pub/Sub, Elastic DB, Solr and Postgres
Strong foundation in developing cloud-based solutions using platforms such as GCP, Azure, AWS or Heroku

Education Qualification:
Must have a Bachelor's degree (B.Tech) in Computer Science, Information Technology, Artificial Intelligence, or Machine Learning.
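The "individual atomic locks" phrase above points at lock-free coordination at the smallest scale. As an illustrative (not posting-specific) example, here is a compare-and-set admission counter in plain Java using java.util.concurrent.atomic; the class and capacity semantics are hypothetical:

```java
import java.util.concurrent.atomic.AtomicLong;

public class AdmissionCounter {

    private final AtomicLong admitted = new AtomicLong();

    // Claims a slot with a lock-free compare-and-set loop rather than
    // a synchronized block, so contended threads never park.
    public boolean tryAdmit(long capacity) {
        while (true) {
            long current = admitted.get();
            if (current >= capacity) {
                return false;               // capacity exhausted
            }
            if (admitted.compareAndSet(current, current + 1)) {
                return true;                // slot claimed atomically
            }
            // CAS lost a race with another thread; re-read and retry.
        }
    }

    public static void main(String[] args) {
        AdmissionCounter counter = new AdmissionCounter();
        System.out.println(counter.tryAdmit(2)); // true
        System.out.println(counter.tryAdmit(2)); // true
        System.out.println(counter.tryAdmit(2)); // false
    }
}
```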
Posted 1 month ago
5 years
0 Lacs
Noida, Uttar Pradesh, India
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

Innovate with Impact: Design and develop software solutions that push the boundaries of what's possible, elevating our capabilities and delighting our customers.
Collaborative Brilliance: Consult with product owners and business partners to define requirements and create software designs that hit the sweet spot between feasibility and excellence.
Mentorship Magic: Share your expertise by mentoring and guiding less experienced team members through the intricate dance of software development, ensuring they become stars in their own right.
Testing Trailblazer: Define scope, develop testing methods, and collaborate with the QA team to enhance our testing efforts. Your goal? Ensure our solutions stand up to the highest standards.
Operational Maestro: Provide top-tier operational support, diagnose complex issues in production systems, and resolve incidents with the finesse of a seasoned performer.
Tech Explorer: Dive into the world of new and alternate technologies, evaluating, recommending, and applying them. Your mission is to keep our team at the forefront of innovation.

Job Qualifications
Experience Aplenty: 5+ years of hands-on experience in applicable software development environments, showcasing your prowess and ability to excel.
Educational Symphony: A Bachelor's degree is strongly preferred, demonstrating your commitment to continuous learning and growth.
Tech Savvy: Should have demonstrated experience in Cloud environments like AWS, GCP, or Azure, and comparable knowledge of tools like Azure Pipelines, BigQuery, MFT, Vault, & DataFlow, plus workflow management and orchestration tools such as Airflow.
Experience with object-oriented and functional programming languages, including Java and Python. Working knowledge of Snowflake and DataFlow is a definite plus!
Business Acumen: Translate business needs into technical requirements with finesse, showcasing your ability to balance technical excellence with customer satisfaction.
Team Player: Collaborate seamlessly with the team, responding to requests in a timely manner, meeting individual commitments, and contributing to the collective success.
Mentor Extraordinaire: Leverage your coaching and teaching skills to guide and mentor your fellow team members, fostering an environment of continuous improvement.

Cotality's Diversity Commitment
Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement
Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy
Global Applicant Privacy Policy

By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBING and will automatically be opted out company-wide.
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Qualification
3-5 years of overall IT experience as a Data Engineer, with hands-on expertise in building data pipelines, data transformation and metadata creation involving large, complex and disconnected data sets.
Graduate in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
Data Engineering Certification from GCP.

Job Description
Assemble large, complex data sets that meet functional / non-functional business requirements.
Build processes supporting data transformation, data structures, metadata, dependency and workload management.
Hands-on in manipulating, processing and extracting value from large disconnected datasets.
Experience in building and optimizing ‘big data’ data pipelines, architectures and data sets.
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
Experience supporting and working with cross-functional teams in a dynamic environment.

Skills/Tools/Techniques
Hands-on proficiency with big data tools: Hadoop, Hive, GCP BigQuery, GCP DataProc, Python/Scala (a short BigQuery client sketch follows this listing)
Nice to have experience in GCP cloud services: Google Cloud Storage, BigQuery, Spanner, Cloud Pub/Sub & Composer
Experience in CI/CD using Jenkins/Gitlab
Nice to have experience with stream-processing systems: Dataflow etc.
Experience with data pipeline and workflow management tools: Airflow, etc.
Strong analytic skills related to working with unstructured datasets.
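For the BigQuery item above, a minimal query through the official BigQuery client for Java could look like the sketch below. It runs a standard-SQL aggregation against a public sample table and assumes application default credentials:

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class BigQueryQuerySketch {
    public static void main(String[] args) throws InterruptedException {
        // Client built from application default credentials.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Standard SQL against a public sample table.
        QueryJobConfiguration query = QueryJobConfiguration
                .newBuilder("SELECT word, SUM(word_count) AS total "
                        + "FROM `bigquery-public-data.samples.shakespeare` "
                        + "GROUP BY word ORDER BY total DESC LIMIT 5")
                .setUseLegacySql(false)
                .build();

        // Blocks until the query job completes, then iterates the result rows.
        TableResult result = bigquery.query(query);
        result.iterateAll().forEach(row ->
                System.out.println(row.get("word").getStringValue()
                        + " -> " + row.get("total").getLongValue()));
    }
}
```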
Posted 1 month ago
7 years
0 Lacs
Pune, Maharashtra, India
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures
Deliver solutions for complex business problems through a standard software development lifecycle (SDLC)
Build strong relationships with both internal and external stakeholders including product, business and sales partners
Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed
Build and manage strong technical teams that deliver complex software solutions that scale
Manage teams with cross-functional skills that include software, quality, reliability engineers, project managers and scrum masters
Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure
Leverage strong experience in full stack software development and public cloud like GCP and AWS
Mentor, coach and develop junior and senior software, quality and reliability engineers
Lead with a data/metrics driven mindset with a maniacal focus towards optimizing and creating efficient solutions
Ensure compliance with EFX secure software development guidelines and best practices, and be responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams
Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices
Drive up-to-date technical documentation including support, end user documentation and run books
Lead Sprint planning, Sprint Retrospectives, and other team activity
Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions
Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience appropriate

What Experience You Need
Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
7+ years experience with Cloud technology: GCP, AWS, or Azure
7+ years experience designing and developing cloud-native solutions
7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and Github)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
What you’ll do
Design, develop, and operate high scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric

Role Description
6+ years experience writing, debugging, and troubleshooting code in mainstream Core Java, SpringBoot, Microservices
Experience with Cloud technology: GCP, AWS, or Azure (added advantage)
Design and build scalable and reliable data pipelines using Google Cloud services such as BigQuery, Dataflow, Dataproc, Pub/Sub and Avro (an Avro encoding sketch follows this listing)
Implement data ingestion and transformation processes for both real-time and batch data streams.
Develop and maintain data models and schemas that support business analytics and intelligence.
Optimize data retrieval and develop dashboards for data visualization and performance monitoring using Google Data Studio or similar tools.
Collaborate with cross-functional teams, including data scientists and business analysts, to define data requirements and improve data quality.
Automate data pipelines and integrate third-party data sources using APIs and cloud services.
Ensure compliance with data governance and security policies.
Troubleshoot and resolve issues in the data pipeline and provide operational support as needed

Qualifications
Excellent problem-solving and analytical skills
Ability to work collaboratively in a team environment
Bachelor's degree in Computer Science, Engineering, or related field
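The pipeline items above mention Avro as a serialization format alongside Pub/Sub and BigQuery. As a rough, self-contained illustration with the Apache Avro Java library, here is how a small event record might be encoded to bytes before publishing; the Event schema is hypothetical:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class AvroEventSketch {

    // Hypothetical event schema for the ingestion path.
    private static final Schema SCHEMA = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"string\"},"
            + "{\"name\":\"ts\",\"type\":\"long\"}]}");

    // Encodes one record to Avro binary, the form a producer would publish.
    public static byte[] encode(String id, long ts) throws IOException {
        GenericRecord record = new GenericData.Record(SCHEMA);
        record.put("id", id);
        record.put("ts", ts);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(SCHEMA).write(record, encoder);
        encoder.flush();
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(encode("evt-1", 1700000000L).length + " bytes");
    }
}
```

Because the consumer decodes with the same schema, compact binary records like this travel well through Pub/Sub and load cleanly into BigQuery.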
Posted 1 month ago
The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.
India's major tech hubs are known for their thriving ecosystems and are home to numerous companies actively hiring for dataflow roles.
The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.
In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.
In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!
Accenture: 36723 Jobs | Dublin
Wipro: 11788 Jobs | Bengaluru
EY: 8277 Jobs | London
IBM: 6362 Jobs | Armonk
Amazon: 6322 Jobs | Seattle, WA
Oracle: 5543 Jobs | Redwood City
Capgemini: 5131 Jobs | Paris, France
Uplers: 4724 Jobs | Ahmedabad
Infosys: 4329 Jobs | Bangalore, Karnataka
Accenture in India: 4290 Jobs | Dublin 2