3.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions through which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more.
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations adapted to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem-solving and troubleshooting skills Strong communicator Willingness to travel in case of project requirements Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship.
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300075
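For context on the kind of hands-on work this listing describes (moving data into BigQuery on GCP), here is a minimal, illustrative sketch using the google-cloud-bigquery Python client. The project, dataset, bucket, and table names are hypothetical placeholders, not part of the posting.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project, dataset and bucket names -- replace with real ones.
TABLE_ID = "my-project.analytics.daily_sales"
SOURCE_URI = "gs://my-landing-bucket/sales/2024-01-01/*.csv"

client = bigquery.Client()

# Batch-load CSV files from Cloud Storage into a BigQuery table.
load_job = client.load_table_from_uri(
    SOURCE_URI,
    TABLE_ID,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,                      # infer schema from the files
        write_disposition="WRITE_APPEND",
    ),
)
load_job.result()  # wait for the load job to finish

# A quick sanity-check query over the freshly loaded data.
query = f"SELECT COUNT(*) AS row_count FROM `{TABLE_ID}`"
for row in client.query(query).result():
    print(f"Loaded table now has {row.row_count} rows")
```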
Posted 6 days ago
6.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions through which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more.
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations adapted to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 6-9 years of relevant consulting, industry or technology experience Strong problem-solving and troubleshooting skills Strong communicator Willingness to travel in case of project requirements Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship.
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300079
Posted 6 days ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Opportunity: Full Stack Data Engineer We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP Native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford. Responsibilities (What You'll Do): Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines. End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight. GCP Data Solutions Leader: Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations. Data Governance & Security Champion: Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data. Data Workflow Orchestrator: Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC). Performance Optimization Driver: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness. Collaborative Innovator: Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering. Automation & Reliability Advocate: Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency. Effective Communicator: Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment. Continuous Learner: Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities. Business Impact Translator: Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals. Documentation & Knowledge Sharer: Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability.
Qualifications (What You'll Bring): Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience). 5-7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred). Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and Dataproc. Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform. Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery). Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks. Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues. Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, Dataproc). A passion for data, innovation, and continuous learning.
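As an illustration of the Pub/Sub-to-BigQuery ingestion pattern this posting references, the sketch below shows a small event-driven function that streams decoded messages into a table. It is an assumption-laden example, not the employer's actual pipeline; the table name, field names, and function shape are hypothetical.

```python
import base64
import json

from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical destination table -- adjust to the real dataset.
TABLE_ID = "my-project.vehicle_telemetry.raw_events"
bq_client = bigquery.Client()


def ingest_event(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    Decodes the message payload and streams it into BigQuery so a
    downstream curated (DBT/Dataform) layer can pick it up later.
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    row = {
        "event_id": payload.get("event_id"),
        "vehicle_id": payload.get("vehicle_id"),
        "event_ts": payload.get("timestamp"),
        "attributes": json.dumps(payload.get("attributes", {})),
    }

    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        # Raising lets Pub/Sub retry the message on transient failures.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```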
Posted 6 days ago
5.0 years
6 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role you will: Design and Develop ETL Processes: Lead the design and implementation of ETL processes using all kinds of batch/streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs. Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks. Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting. GCP Dataflow Development: Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy. Collaborate with data analysts and data scientists to prepare data for analysis and reporting. Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs. Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities. Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team. Requirements To be successful in this role, you should meet the following requirements: Education: Bachelor’s degree in Computer Science, Information Systems, or a related field. Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development. Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience with cloud-based solutions, especially in GCP; cloud-certified candidates are preferred. Experience and knowledge of Big Data processing in batch and streaming modes, and proficiency in Big Data ecosystems, e.g. Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark, etc.
Familiarity with Java and Python for data manipulation on Cloud/Big Data platforms. Analytical Skills: Strong problem-solving skills with keen attention to detail. Ability to analyze complex data sets and derive meaningful insights. Benefits: Competitive salary and comprehensive benefits package. Opportunity to work in a dynamic and collaborative environment on cutting-edge data projects. Professional development opportunities to enhance your skills and advance your career. If you are a passionate data engineer with expertise in ETL processes and a desire to make a significant impact within our organization, we encourage you to apply for this exciting opportunity! You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
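The "Apache Beam-based Dataflow job" work mentioned in this listing might look like the minimal streaming sketch below, which reads from Pub/Sub, parses JSON, and writes to BigQuery. The project, subscription, and table names are hypothetical, and the record schema is invented purely for illustration.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_message(raw: bytes) -> dict:
    """Transform step: decode a Pub/Sub payload into a BigQuery row."""
    record = json.loads(raw.decode("utf-8"))
    return {
        "txn_id": record.get("txn_id"),
        "account": record.get("account"),
        "amount": float(record.get("amount", 0)),
        "event_time": record.get("event_time"),
    }


def run():
    # Hypothetical project/subscription/table names for illustration only.
    options = PipelineOptions(
        streaming=True,
        project="my-project",
        region="europe-west2",
        runner="DataflowRunner",
        temp_location="gs://my-temp-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/txn-events")
            | "ParseJson" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:payments.transactions",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```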
Posted 6 days ago
10.0 years
25 - 30 Lacs
Cochin
On-site
Job Role: Data Architect Experience: 10+ years Notice period: Immediate to 15 days Location: Trivandrum / Kochi Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud Certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills. Job Types: Full-time, Permanent Pay: ₹2,500,000.00 - ₹3,000,000.00 per year Benefits: Paid time off Provident Fund Schedule: Monday to Friday UK shift Supplemental Pay: Performance bonus Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Required) License/Certification: Google Cloud Certification (Required) Willingness to travel: 100% (Required) Work Location: In person
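One concrete example of the "BigQuery solutions for optimized performance and cost efficiency" this role calls for is creating partitioned and clustered tables so queries scan less data. A hedged sketch with the BigQuery Python client follows; the table name and schema are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table; partitioning + clustering keep scan costs down.
table = bigquery.Table(
    "my-project.sales_dw.orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
# Daily time partitioning on the order timestamp, expiring after a year.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
    expiration_ms=1000 * 60 * 60 * 24 * 365,
)
# Cluster within each partition by the most common filter column.
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
print(f"Created or found table {table.project}.{table.dataset_id}.{table.table_id}")
```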
Posted 6 days ago
7.0 years
30 Lacs
Cochin
On-site
Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Must-Have Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills. Job Type: Full-time Pay: Up to ₹3,000,000.00 per year Application Question(s): How many years of experience do you have in total? Do you have all skills as per the JD? Current CTC? Expected CTC? Location? Notice Period? Work Location: In person
Posted 6 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Role: Database Engineer Location: Remote Notice Period: 30 Days Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently.
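To illustrate the Pandas/SQLAlchemy import workflows this listing mentions, here is a small, hypothetical sketch that cleans a CSV export and appends it to a PostgreSQL staging table. The connection string, file path, and column names are placeholders, not details from the posting.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string and file path -- substitute real credentials.
ENGINE = create_engine("postgresql+psycopg2://etl_user:secret@db-host:5432/analytics")
SOURCE_FILE = "exports/customers_2024-06-01.csv"


def import_customers(path: str) -> int:
    """Load a CSV export, apply light cleaning, and append it to Postgres."""
    df = pd.read_csv(path, parse_dates=["signup_date"])

    # Basic cleaning: normalise emails and drop rows missing a primary key.
    df["email"] = df["email"].str.strip().str.lower()
    df = df.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])

    # Write to a staging table; a follow-up SQL MERGE would handle true upserts.
    df.to_sql("customers_staging", ENGINE, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    print(f"Imported {import_customers(SOURCE_FILE)} rows")
```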
Posted 6 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Basic Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field of study 5+ Years - Ability to work effectively across organizations, product teams and business partners. 5+ Years - Knowledge of Agile (Scrum) Methodology, experience in writing user stories 5+ Years - Strong understanding of Database concepts and experience with multiple database technologies – optimizing query and data processing performance. 5+ Years - Full Stack Data Engineering Competency in a public cloud – Google, MS Azure, AWS Critical thinking skills to propose data solutions, test, and make them a reality. 5+ Years - Highly Proficient in SQL, Python, Java, Scala, or Go (or similar) - Experience programming engineering transformation in Python or a similar language. 5+ Years Demonstrated ability to lead data engineering projects, design sessions and deliverables to successful completion. Cloud native technologist Deep understanding of data service ecosystems including data warehousing, lakes, metadata, meshes, fabrics and AI/ML use cases. User experience advocacy through empathetic stakeholder relationships. Effective Communication both internally (with team members) and externally (with stakeholders) Knowledge of Data Warehouse concepts – experience with Data Warehouse/ ETL processes Strong process discipline and thorough understanding of IT processes (ISP, Data Security). Responsibilities: Interact with GDIA product lines and business partners to understand data engineering opportunities, tooling and needs. Collaborate with Data Engineering and Data Architecture to design and build templates, pipelines and data products including automation, transformation and curation using best practices Develop custom cloud solutions and pipelines with GCP native tools – Dataprep, Data Fusion, Dataflow, DBT and BigQuery Operationalize and automate data best practices: quality, auditability, timeliness and completeness Participate in design reviews to accelerate the business and ensure scalability Work with Data Engineering and Architecture and Data Platform Engineering to implement strategic solutions Advise and direct team members and business partners on Ford standards and processes. Preferred Qualifications: Excellent communication, collaboration and influence skills; ability to energize a team. Knowledge of data, software and architecture operations, data engineering and data management standards, governance and quality Hands-on experience in Python using libraries like NumPy, Pandas, etc. Extensive knowledge and understanding of GCP offerings, bundled services, especially those associated with data operations Cloud Console, BigQuery, DataFlow, DataFusion, PubSub / Kafka, Looker Studio, VertexAI Experience with Teradata, Hadoop, Hive, Spark and other parts of legacy data platform Experience with recoding, re-developing and optimizing data operations, data science and analytical workflows and products. Data Governance concepts including GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act), PoLP and how these can impact technical architecture
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description UST is looking for a talented GCP Data Engineer with 5 to 10 years of experience to join our team and play a crucial role in designing and implementing efficient data solutions on the Google Cloud Platform (GCP). The ideal candidate should possess strong data engineering skills, expertise in GCP services, and proficiency in data processing technologies, particularly PySpark. Responsibilities Data Pipeline Development: Design, implement, and optimize end-to-end data pipelines on GCP, focusing on scalability and performance. Develop and maintain ETL workflows for seamless data processing. GCP Cloud Expertise Utilize GCP services such as BigQuery, Cloud Storage, and Dataflow for effective data engineering. Implement and manage data storage solutions on GCP. Data Transformation With PySpark Leverage PySpark for advanced data transformations, ensuring high-quality and well-structured output. Implement data cleansing, enrichment, and validation processes using PySpark. Requirements Proven experience as a Data Engineer, with a strong emphasis on GCP. Proficiency in GCP services such as BigQuery, Cloud Storage, and Dataflow. Expertise in PySpark for data processing and analytics is a must. Experience with data modeling, ETL processes, and data warehousing. Proficiency in programming languages such as Python, SQL, or Scala for data processing. Relevant certifications in GCP or data engineering are a plus. Skills: GCP, PySpark
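A minimal PySpark sketch of the cleansing, validation, and enrichment steps described in this listing is shown below; the GCS paths, column names, and join keys are hypothetical and would differ in a real engagement.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Hypothetical GCS paths; on Dataproc the GCS connector resolves gs:// URIs.
raw = spark.read.json("gs://my-landing-bucket/orders/*.json")
customers = spark.read.parquet("gs://my-curated-bucket/dim_customers/")

curated = (
    raw
    # Cleansing: drop records without a key and normalise country codes.
    .dropna(subset=["order_id", "customer_id"])
    .withColumn("country", F.upper(F.trim(F.col("country"))))
    # Validation: keep only positive order amounts.
    .filter(F.col("amount") > 0)
    # Enrichment: join in customer attributes from a curated dimension.
    .join(customers.select("customer_id", "segment"), on="customer_id", how="left")
    .withColumn("load_ts", F.current_timestamp())
)

# Write partitioned Parquet back to the consumption zone.
(curated.write
    .mode("overwrite")
    .partitionBy("country")
    .parquet("gs://my-curated-bucket/fact_orders/"))
```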
Posted 6 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Swarovski, where innovation meets inspiration, our people desire to explore, experience and create. We are looking for a Data Engineer who will get a chance to work in a rewarding role within a diverse team that is pushing boundaries. Be part of a truly iconic global brand, learn and grow with us. We’re bold and inventive, revealing astonishing things like no one else can. A world of wonder awaits you. About The Job Maintain and support data transformation pipelines from the Landing Zone to the Consumption Layer. Monitor the health and performance of data pipelines, infrastructure, and associated systems within GCP and SAP BW environments, ensuring seamless and efficient operations. Identify, diagnose, and resolve issues or anomalies in real-time, minimizing downtime and safeguarding data integrity. Manage and resolve support tickets related to data engineering operations, ensuring timely and effective responses to stakeholder needs. Collaborate with cross-functional teams to understand ongoing data requirements and provide necessary support for existing solutions, ensuring alignment with current business needs. Act as the primary contact for troubleshooting and resolving data-related issues raised by other departments. Regularly review the performance of existing data pipelines, identifying opportunities for improvement and implementing optimizations to enhance efficiency, performance, and scalability. Provide ongoing support to data scientists and analysts, ensuring consistent access to reliable data and the necessary tools for analysis and reporting. Support and maintain processes for integrating and transforming data within the Data Warehouse and across all Data Analytics platforms. About you: We are looking for a unique and amazing talent who brings along the following: Bachelor's or Master's degree in Computer Science, Engineering, or a related field 5 years of hands-on experience in data engineering, building and maintaining data pipelines and systems Strong proficiency in GCP (Google Cloud Platform), including services such as BigQuery, Dataflow, Dataform, Cloud Storage and Pub/Sub Solid programming skills in languages like Python, SQL Experience with data modeling, data warehousing, and ETL/ELT processes Familiarity with data governance, data privacy, and security practices Strong problem-solving and analytical skills, with the ability to handle complex data-related issues Excellent communication and collaboration skills to work effectively with cross-functional teams Self-motivated and able to work independently, as well as in a team-oriented environment Deep understanding of SAP modules Nice-to-have: Experience in SAP BW data engineering (SAP BW 3.x, 7.x, BW on HANA) About Swarovski Masters of Light Since 1895 Swarovski creates beautiful crystal-based products of impeccable quality and craftsmanship that bring joy and celebrate individuality. Founded in 1895 in Austria, the company designs, manufactures and sells the world's finest crystals, gemstones, Swarovski Created Diamonds and zirconia, jewelry, and accessories, as well as objects and home accessories. Swarovski Crystal Business has a global reach with approximately 2,400 stores and 6,700 points of sale in over 150 countries and employs more than 18,000 people. Together with its sister companies Swarovski Optik (optical devices) and Tyrolit (abrasives), Swarovski Crystal Business forms the Swarovski Group. A responsible relationship with people and the planet is part of Swarovski’s heritage.
Today this legacy is rooted in sustainability measures across the value chain, with an emphasis on circular innovation, championing diversity, inclusion and self-expression, and in the philanthropic work of the Swarovski Foundation, which supports charitable organizations bringing positive environmental and social impact.
Posted 6 days ago
7.0 years
0 Lacs
Kochi, Kerala, India
On-site
Company Description Chadwick Professional Services specializes in IT Permanent and Temporary Staffing, Leadership/Executive Hiring, Recruitment Process Outsourcing (RPO), and Market Insight. Recognized by MSME, Start-up India, and Start-up Karnataka, Chadwick is dedicated to innovative solutions in the industry. Our goal is to help clients undertake crucial projects by providing and managing exceptional talent. We offer a spectrum of staffing solutions, from short-term to strategic, suitable for organizations across all sizes and locations. With a blend of advanced technology and an agile team of recruiters, we enable our clients to focus on maximizing business performance and productivity. Role Description We are seeking a full-time GCP Data Architect for an on-site role located in Kochi and Trivandrum. The GCP Data Architect will be responsible for designing, implementing, and managing data architecture solutions. Daily tasks include overseeing data governance, data modeling, ETL processes, and data warehousing. The role requires close collaboration with various teams to ensure that data architecture solutions meet the business's needs and strategic goals. Qualifications • 7+ years of experience in data architecture, with at least 3 years in GCP environments. • Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. • Strong experience in data warehousing, data lakes, and real-time data pipelines. • Proficiency in SQL, Python, or other data processing languages. • Experience with cloud security, data governance, and compliance frameworks. • Strong problem-solving skills and ability to architect solutions for complex data environments. • Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. • Leadership experience and ability to mentor technical teams. • Excellent communication and collaboration skills.
Posted 6 days ago
0.0 - 3.0 years
0 Lacs
Kochi, Kerala
On-site
Job Role: Data Architect Experience: 10+ years Notice period: Immediate to 15 days Location: Trivandrum / Kochi Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud Certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills. Job Types: Full-time, Permanent Pay: ₹2,500,000.00 - ₹3,000,000.00 per year Benefits: Paid time off Provident Fund Schedule: Monday to Friday UK shift Supplemental Pay: Performance bonus Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Required) License/Certification: Google Cloud Certification (Required) Willingness to travel: 100% (Required) Work Location: In person
Posted 6 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Spark, Data Engineer, Cloudera, On-premise, Azure, Snowflake, Kafka Overview Of The Company Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries. Team Overview The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution! About the role. Title: Lead Data Engineer Location: Mumbai Responsibilities End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow. Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution. Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise. Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices. Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights. Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth. Qualification Details Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field. Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts. Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.). Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks including streaming real-time data. Cloud Expertise: Knowledge of Cloud Technologies like Azure HDInsight, Synapse, EventHub and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation. Desired Skills & Attributes Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems, troubleshoot data pipeline issues effectively. Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders). Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
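As a sketch of the Kafka plus Spark streaming stack this role lists, the example below uses Spark Structured Streaming to read a Kafka topic, parse JSON events, and land them as Parquet. The broker addresses, topic, schema, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

# Hypothetical event schema for the illustration.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("value", DoubleType()),
])

# Read a Kafka topic as an unbounded stream (hypothetical brokers/topic).
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092,broker-2:9092")
    .option("subscribe", "user-events")
    .option("startingOffsets", "latest")
    .load()
)

# Parse the JSON payload and stamp an ingestion time.
parsed = (
    stream
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withColumn("ingest_ts", F.current_timestamp())
)

# Land micro-batches as Parquet with checkpointing for exactly-once sinks.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/lake/user_events/")
    .option("checkpointLocation", "/data/checkpoints/user_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```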
Posted 6 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Apache Spark, Snowflake, Data Engineer, Spark, Kafka, Azure Overview Of The Company Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries. Team Overview The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution! About the role. Title: Lead Data Engineer Location: Mumbai Responsibilities End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow. Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution. Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise. Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices. Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights. Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth. Qualification Details Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field. Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts. Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.). Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks including streaming real-time data. Cloud Expertise: Knowledge of Cloud Technologies like Azure HDInsight, Synapse, EventHub and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation. Desired Skills & Attributes Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems, troubleshoot data pipeline issues effectively. Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders). Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
Posted 6 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high-scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality. Manage individual project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools. Agile environments (e.g. Scrum, XP) Relational databases Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
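To make the Dataflow/Apache Beam item in the "What could set you apart" list concrete, here is a minimal, hedged sketch of a batch pipeline that could run on Dataflow; the bucket, project, dataset, and table names are hypothetical placeholders, not anything specified in the posting.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # Runs locally by default; pass --runner=DataflowRunner plus project/region
        # options to execute the same pipeline on Google Cloud Dataflow.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadCsv" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")  # hypothetical path
                | "Parse" >> beam.Map(lambda line: line.split(","))
                | "KeepValid" >> beam.Filter(lambda fields: len(fields) == 3)
                | "ToDict" >> beam.Map(lambda f: {"id": f[0], "name": f[1], "score": float(f[2])})
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:example_dataset.scores",  # hypothetical table
                    schema="id:STRING,name:STRING,score:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()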
Posted 6 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. How will you make an impact in this role? We are looking for energetic, high-performing and highly skilled Java Full Stack Engineers with some GCP Cloud experience to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation global marketing capabilities. This team is responsible for building products that power Merchant Offers personalization for Amex card members. Job Description: Demonstrated leadership in designing sustainable software products, setting development standards, automated code review process, continuous build and rigorous testing etc Ability to effectively lead and communicate across 3rd parties, technical and business product managers on solution design Primary focus is spent writing code, API specs, conducting code reviews & testing in ongoing sprints or doing proof of concepts/automation tools Applies visualization and other techniques to fast track concepts Functions as a core member of an Agile team driving User story analysis & elaboration, design and development of software applications, testing & builds automation tools Works on a specific platform/product or as part of a dynamic resource pool assigned to projects based on demand and business priority Identifies opportunities to adopt innovative technologies Design, develop, and deploy scalable applications on the Google Cloud Platform using Java 8 & above. In-depth understanding of Google Cloud Platform services and architecture. Proficiency in developing and deploying applications on GCP using services such as Compute Engine, App Engine, Cloud Functions, Big Query, Cloud SQL, DataProc, Dataflow etc. Experience with containerization technologies such as Docker and Kubernetes is a plus. Awareness of best practices for cloud-based application development, ensuring security, reliability, and efficiency. Troubleshoot and resolve issues related to application performance and functionality on cloud Qualification: A bachelor’s degree in computer science, computer engineering, other technical discipline, or equivalent work experience 7+ years of experience in software development; 3-5 years in leading teams of engineers Demonstrated experience with Agile or other rapid application development methods Demonstrated experience with object-oriented design and coding Full-Stack development exposure Demonstrated experience on these core technical skills (Mandatory) Core Java, Spring Framework, Spring boot, Java EE Relational Database (Postgres / MySQL / DB2 etc.) 
Data Serialization techniques (Ex: Avro)
Cloud development (Micro-services)
Parallel & distributed (multi-tiered) systems
Application design, software development and automated testing
Demonstrated experience on these additional technical skills (Nice to Have)
Unix / Shell scripting
GenNext languages: Python / Scala / Golang / Julia etc.
Message Queuing, Stream processing (Kafka)
AJAX tools/frameworks
Web services, open API development, and REST concepts
Experience with implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, automated testing and JUnit.
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
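The "Data Serialization techniques (Ex: Avro)" item lends itself to a small illustration. This is a hedged sketch using the open-source fastavro package and a hypothetical merchant-offer record; neither the library choice nor the schema comes from the posting.

    from io import BytesIO
    from fastavro import parse_schema, writer, reader

    # Hypothetical schema for a merchant-offer record.
    schema = parse_schema({
        "type": "record",
        "name": "MerchantOffer",
        "fields": [
            {"name": "card_member_id", "type": "string"},
            {"name": "merchant", "type": "string"},
            {"name": "discount_pct", "type": "float"},
        ],
    })

    records = [{"card_member_id": "cm-001", "merchant": "acme-coffee", "discount_pct": 10.0}]

    buf = BytesIO()
    writer(buf, schema, records)   # serialize to the Avro object container format
    buf.seek(0)
    for rec in reader(buf):        # deserialize and verify the round trip
        print(rec)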
Posted 6 days ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk Management
Level: Senior Associate
Job Description & Summary: We are seeking a highly skilled SailPoint Developer. Candidates with 2-3 years of experience must be SailPoint certified; for more than 3 years of experience, SailPoint certification is not mandatory but good to have.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking an experienced BigQuery Specialist with 3-4 years of hands-on experience in managing, analyzing, and integrating data within Google Cloud Platform. The ideal candidate will have a strong background in data exploration, ETL processes, and log management, along with proficiency in Python for data manipulation and export.
Responsibilities
BigQuery Setup and Modification:
- Design, implement, and modify BigQuery datasets and tables.
- Optimize BigQuery architecture for performance and cost-effectiveness.
Data Analysis:
- Querying BigQuery: Writing and executing SQL queries to extract, transform, and load (ETL) data from BigQuery.
- Data Exploration: Using SQL to explore datasets, identify patterns, trends, and anomalies.
- Data Interpretation: Analyzing results and drawing meaningful conclusions to inform business decisions.
- Data Visualization: Creating reports and dashboards using tools like Looker Studio, Google Sheets, or other BI tools to communicate insights.
- Data Quality: Identifying and addressing data quality issues, ensuring data accuracy and consistency.
Log Ingestion and Modification:
- Develop and manage processes for ingesting logs into BigQuery tables.
- Ensure data accuracy and consistency through effective data transformation techniques.
Integration with Security Solutions:
- Collaborate with security teams to integrate BigQuery with other security solutions.
- Implement data governance and security best practices within BigQuery.
Licensing and Module Management:
- Understand and manage GCP licensing and module configurations related to BigQuery.
- Provide guidance on cost management and optimization strategies.
Data Modeling and Management:
- Data Modeling: Understanding data structures and relationships within BigQuery datasets.
- Data Pipelines: Developing and maintaining data pipelines for ingesting, transforming, and loading data into BigQuery.
- Data Governance: Ensuring data security, privacy, and compliance with relevant regulations.
Data Export and Python Proficiency:
- Export logs and data efficiently from BigQuery using Python.
- Develop scripts and tools for data manipulation and automation tasks.
Optional Skills:
- Data visualization skills (e.g., Tableau, Looker).
- Basic programming skills (e.g., Python, R).
- Experience with machine learning or predictive analytics.
Mandatory Skill Sets
3-4 years of experience working with Google BigQuery and GCP.
Strong proficiency in SQL and Python programming.
Experience with data integration, ETL processes, and data pipeline management.
Familiarity with security solutions and data governance practices.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
Preferred Skill Sets
Experience with other GCP services such as Google Cloud Storage, Dataflow, or Pub/Sub.
Knowledge of machine learning concepts and BigQuery ML.
Certification in Google Cloud Platform (e.g., Professional Data Engineer).
Years of experience required: 3-7 years
Education Qualification: B.Tech/MCA/MBA with IT background / Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Google BigQuery
Optional Skills: Accepting Feedback, Access Control Models, Access Control System, Access Management, Active Listening, Analytical Thinking, Authorization Compliance, Authorization Management Systems, Azure Active Directory, Cloud Identity and Access Management (IAM), Communication, Creativity, CyberArk Management, Cybersecurity, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Federated Identity Management, ForgeRock Identity Platform, Identity and Access Management (IAM), Identity-Based Encryption, Identity Federation, Identity Governance Framework (IGF) {+ 22 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
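As a hedged sketch of the "export logs and data efficiently from BigQuery using Python" responsibility above, the snippet below uses the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, and converting results to a DataFrame assumes the pandas and db-dtypes extras are installed.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project id

    query = """
        SELECT timestamp, severity, jsonPayload.message AS message
        FROM `example-project.logging_dataset.app_logs`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
          AND severity = 'ERROR'
    """

    # Run the query, pull the results into a DataFrame, and export to CSV.
    df = client.query(query).result().to_dataframe()
    df.to_csv("error_logs_last_24h.csv", index=False)
    print(f"Exported {len(df)} rows")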
Posted 6 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: GCP Data Engineer
Experience: 3–4 Years
Location: Gurgaon (On-site)
Shift: Rotational (Night shift allowance applicable)
Joiners: Immediate only
Role Overview: We are looking for a skilled and motivated GCP Data Engineer with 3–4 years of experience to join our growing team in Gurgaon. This role involves working in rotational shifts, including night shifts, for which additional allowance will be provided. The ideal candidate should have a strong command over Python, SQL, GCP, and BigQuery.
Key Responsibilities:
Develop, manage, and optimize data pipelines and workflows on Google Cloud Platform (GCP)
Write and optimize complex SQL queries to transform and extract data
Build scalable and efficient solutions using BigQuery
Collaborate with data scientists and business teams to understand requirements and deliver data-driven solutions
Monitor and troubleshoot production jobs and ensure data quality
Work in rotational shifts to support global data operations
Required Skills:
Proficient in Python for data manipulation and scripting
Strong in SQL – writing, debugging, and optimizing queries
Hands-on experience with Google Cloud Platform (GCP)
Expertise in BigQuery for data warehousing and analytics
Good understanding of data pipelines, ETL/ELT processes, and cloud data architecture
Nice to Have:
Experience working in a production environment with monitoring and alerting tools
Knowledge of GCP services like Dataflow, Cloud Storage, and Pub/Sub
Exposure to CI/CD pipelines and version control systems (e.g., Git)
Additional Information:
Location: Gurgaon (on-site role)
Shift: Rotational (Night shift allowance will be provided)
Notice Period: Immediate Joiners Preferred
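One typical pipeline step implied by these responsibilities, loading a file from Cloud Storage into BigQuery, can be sketched as below with the google-cloud-bigquery client; the bucket, file, and table names are hypothetical placeholders.

    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/daily/orders.csv",   # hypothetical source file
        "example_dataset.orders_staging",         # hypothetical staging table
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish

    table = client.get_table("example_dataset.orders_staging")
    print(f"Loaded {table.num_rows} rows into orders_staging")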
Posted 6 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What you’ll do
Design, develop, and operate high scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Manage sole project priorities, deadlines, and deliverables.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activity.
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of relevant software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs
What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and Github)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Cloud Certification strongly preferred
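Because Pub/Sub appears throughout these requirements, here is a minimal, hedged publish-and-pull sketch using the google-cloud-pubsub client; the project, topic, and subscription names are hypothetical.

    from google.cloud import pubsub_v1

    project_id = "example-project"          # hypothetical
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, "orders-topic")

    # publish() returns a future that resolves to the server-assigned message id.
    future = publisher.publish(topic_path, b'{"order_id": "123"}', source="demo")
    print("Published message", future.result())

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(project_id, "orders-sub")

    def callback(message):
        print("Received:", message.data)
        message.ack()

    # Streaming pull; stop after a short wait in this demo.
    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=10)
    except Exception:
        streaming_pull.cancel()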
Posted 6 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Get to know Okta Okta is The World’s Identity Company. We free everyone to safely use any technology—anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we’re looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We’re building a world where Identity belongs to you. Job Description: Okta is looking for a Marketing Operations Manager to join the Marketing Data Operations and Technology team. Okta has ambitious growth plans and is striving for operational excellence across the underlying marketing technology infrastructure to support the wider go-to-market organization. This role will be instrumental in optimizing our marketing automation and CRM processes, ensuring data integrity, and driving lead lifecycle management operations. You will work closely with various teams across the organization (including Marketing Automation, Business Technology, and Digital Marketing) to maintain and optimize Okta’s Marketing Operations infrastructure. Our ideal candidate will be a creative problem-solver with a strong technical aptitude and a passion for leveraging our tools to help solve business problems. This role will be reporting into the Sr. Manager, Marketing Operations. Job Duties And Responsibilities : Oversee Marketo and Marketo integrations as a subject matter expert (including tool provisioning, supporting internal security initiatives, monitoring Marketo system health, and engaging with Marketo technical support teams). Develop and maintain comprehensive documentation including technical architecture, business requirements, and data dictionaries. Support and troubleshoot third party integration setup and dataflow into Marketo such as paid media, webinar platforms (On24, BrightTalk…etc.), content syndication vendors, and more. Lead tool technical support and drive tool optimization in Marketo integrated systems, including Splash, 6sense, Folloze, and more. Routinely audit, optimize, and manage the core operational processing programs and architecture within Marketo. Partner closely with the Marketing Automation & Analytics team to ensure that data collection and data transfer across Marketo and the Marketing Technology ecosystem support the marketing needs. Partner closely with Legal & Privacy to manage Marketo privacy and compliance programs through audits, updates, and documentation. Support extended teams like Marketing and Business Technology with Marketing Operations initiatives and inquiries. Skills & Experience: To successfully undertake this role, the individual will be expected to have: 7+ years relevant Marketing Operations experience at a fast-growing B2B SaaS company Required: 5+ years of experience of managing and working with Marketo at an admin capacity - Marketo Certified Expert preferred Required: Strong critical thinker and problem solver, with an eye for detail Required: Proven ability to collaborate effectively across functionally and geographically distributed teams Required: Experience of working with business stakeholders to understand existing workflows, business requirements and translate this into solution design and delivery. 
Required: Exceptional written and verbal communication skills with the ability to convey technical information in a clear and concise manner
Preferred: Experience of working with Salesforce (preference for candidates who have worked directly with systems integrations with Salesforce). Candidates should be comfortable with the core Salesforce object models
Preferred: Candidates with exposure to a range of Marketing Technology applications, for example:
Sales outreach platforms: such as Outreach / SalesLoft
ABM platforms: such as 6sense / Folloze
Event tech platforms: such as On24 / BrightTalk / Rainfocus
Data Enrichment Tools: such as Leadspace / Clay / ZoomInfo / Clearbit
"This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment."
What you can look forward to as a Full-Time Okta employee!
Amazing Benefits
Making Social Impact
Developing Talent and Fostering Connection + Community at Okta
Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/. Some roles may require travel to one of our office locations for in-person onboarding.
Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.
Posted 6 days ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
Lead a team of Software Engineers to design, develop, and operate high scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Manage sole project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activity.
What Experience You Need
Bachelor's degree or equivalent experience
10+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs.
What Could Set You Apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and Github)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 6 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
Design, develop, and operate high scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Manage sole project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activity.
What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs
What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and Github)
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 6 days ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Tech Labs is Hiring Data Modeler
Experience – 7+ Years (at least 4.5–5 years relevant)
Location – Pune & Bangalore
Mode – Hybrid
Notice – Immediate to 15 days (preferred) / Under 30 days (if profile is good enough)
Share profile - suyash.verma@tdtl.world
Primary skills: Data profiling, Data Modelling, Source-to-target Mapping, SQL and data warehousing concepts.
Responsibilities:
Collaboration & Communication: Partner with data engineers to implement ETL/ELT processes aligned with models and mappings. Liaise with analysts and business users to ensure models meet reporting needs. Explain complex data concepts to both technical and non-technical audiences.
Data Governance & Quality: Promote and enforce modelling policy, standards, and best practices. Drive data quality and integrity within the warehouse.
Technical Environment: Use GCP BigQuery, Google Cloud Storage, and Dataflow for schema design, performance tuning, and cost optimization.
Documentation: Maintain clear documentation for models, dictionaries, and lineage.
Skills and Qualifications
Proven experience as a Data Modeler in a data warehouse setting.
Strong grasp of data warehousing principles and best practices.
Essential: Proficient in data profiling techniques and tools.
Essential: Hands-on expertise in dimensional modelling.
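For the dimensional-modelling and BigQuery schema-design skills above, here is a hedged sketch of creating a date-partitioned, clustered fact table with the google-cloud-bigquery client; the project, dataset, and column names are hypothetical and not taken from the posting.

    from google.cloud import bigquery

    client = bigquery.Client()

    schema = [
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_key", "INT64"),   # surrogate key to a customer dimension
        bigquery.SchemaField("order_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ]

    table = bigquery.Table("example-project.sales_dw.fact_orders", schema=schema)
    # Partitioning by date and clustering by the dimension key helps both query
    # performance and cost, since scans can prune partitions and clustered blocks.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="order_date"
    )
    table.clustering_fields = ["customer_key"]

    table = client.create_table(table)
    print(f"Created {table.full_table_id}")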
Posted 1 week ago
9.0 years
5 - 10 Lacs
Thiruvananthapuram
On-site
9 - 12 Years | 1 Opening | Trivandrum
Role description
Role Proficiency: Leverage expertise in a technology area (e.g. Informatica transformation, Teradata data warehouse, Hadoop, Analytics). Responsible for architecture for small/mid-size projects.
Outcomes:
Implement either data extract and transformation, a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools in any one of the cloud providers (AWS/Azure/GCP)
Understand business workflows and related data flows. Develop design for data acquisitions and data transformation or data modelling; apply business intelligence on data or design data fetching and dashboards
Design information structure, work- and dataflow navigation. Define backup, recovery and security specifications
Enforce and maintain naming standards and data dictionary for data models
Provide or guide team to perform estimates
Help team to develop proof of concepts (POC) and solutions relevant to customer problems. Able to troubleshoot problems while developing POCs
Architect/Big Data Speciality Certification (AWS/Azure/GCP/General, for example Coursera or similar learning platform/any ML)
Measures of Outcomes:
Percentage of billable time spent in a year for developing and implementing data transformation or data storage
Number of best practices documented in any new tool and technology emerging in the market
Number of associates trained on the data service practice
Outputs Expected:
Strategy & Planning: Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. Implement methods and procedures for tracking data quality, completeness, redundancy and improvement. Ensure that data strategies and architectures meet regulatory compliance requirements. Begin engaging external stakeholders including standards organizations, regulatory bodies, operators and scientific research communities, or attend conferences with respect to data in cloud.
Operational Management: Help Architects to establish governance, stewardship and frameworks for managing data across the organization. Provide support in implementing the appropriate tools, software, applications and systems to support data technology goals. Collaborate with project managers and business teams for all projects involving enterprise data. Analyse data-related issues with systems integration, compatibility and multi-platform integration.
Project Control and Review: Provide advice to teams facing complex technical issues in the course of project delivery. Define and measure project- and program-specific architectural and technology quality metrics.
Knowledge Management & Capability Development: Publish and maintain a repository of solutions, best practices, standards and other knowledge articles for data management. Conduct and facilitate knowledge sharing and learning sessions across the team. Gain industry-standard certifications on technology or area of expertise. Support technical skill building (including hiring and training) for the team based on inputs from project manager/RTE’s. Mentor new members in the team in technical areas. Gain and cultivate domain expertise to provide the best and optimized solution to customer (delivery).
Requirement Gathering and Analysis: Work with customer business owners and other teams to collect, analyze and understand the requirements, including NFRs/define NFRs. Analyze gaps/trade-offs based on current system context and industry practices; clarify the requirements by working with the customer. Define the systems and sub-systems that define the programs.
People Management: Set goals and manage performance of team engineers. Provide career guidance to technical specialists and mentor them.
Alliance Management: Identify alliance partners based on the understanding of service offerings and client requirements. In collaboration with Architect, create a compelling business case around the offerings. Conduct beta testing of the offerings and relevance to program.
Technology Consulting: In collaboration with Architects II and III, analyze the application and technology landscape, processes and tools to arrive at the architecture options best fit for the client program. Analyze cost vs benefits of solution options. Support Architects II and III to create a technology/architecture roadmap for the client. Define architecture strategy for the program.
Innovation and Thought Leadership: Participate in internal and external forums (seminars, paper presentations, etc.). Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency. Identify business opportunities to create reusable components/accelerators and reuse existing components and best practices.
Project Management Support: Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies.
Stakeholder Management: Monitor the concerns of internal stakeholders like Product Managers and RTE’s and external stakeholders like client architects on architecture aspects. Follow through on commitments to achieve timely resolution of issues. Conduct initiatives to meet client expectations. Work to expand professional network in the client organization at team and program levels.
New Service Design: Identify potential opportunities for new service offerings based on customer voice/partner inputs. Conduct beta testing/POC as applicable. Develop collaterals and guides for GTM.
Skill Examples:
Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under guidance of Architects.
Use technology knowledge to create proof of concepts (POCs) / (reusable) assets under the guidance of the specialist. Apply best practices in own area of work, helping with performance troubleshooting and other complex troubleshooting. Define, decide and defend the technology choices made; review solutions under guidance.
Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
Use independent knowledge of design patterns, tools and principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate options for best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by Specialists for efficiency (consumption of hardware, memory, memory leaks, etc.).
Use knowledge of software development processes, tools & techniques to identify and assess incremental improvements for software development process, methodology and tools. Take technical responsibility for all stages in the software development process. Conduct optimal coding with clear understanding of memory leakage and related impact.
Implement global standards and guidelines relevant to programming and development; come up with 'points of view' and new technological ideas.
Use knowledge of project management & agile tools and techniques to support, plan and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
Use knowledge of project metrics to understand relevance in project. Collect and collate project metrics and share with the relevant stakeholders.
Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
Strong proficiency in understanding data workflows and dataflow.
Attention to detail.
High analytical capabilities.
Knowledge Examples:
Data visualization
Data migration
RDBMSs (relational database management systems), SQL, Hadoop technologies like MapReduce, Hive and Pig
Programming languages, especially Python and Java
Operating systems like UNIX and MS Windows
Backup/archival software
Additional Comments:
Snowflake Architect
Key Responsibilities:
• Solution Design: Designing the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms).
• Data Modeling: Designing efficient and scalable physical data models within Snowflake. Defining table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance.
• Security Architecture: Designing the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies.
• Performance and Scalability Strategy: Designing solutions with performance and scalability in mind. Defining warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensuring the architecture can handle future growth in data volume and user concurrency.
• Cost Optimization Strategy: Designing architectures that are inherently cost-effective. Making strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks).
• Technology Evaluation and Selection: Evaluating and recommending specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements.
• Standards and Governance: Defining best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization.
• Roadmap and Strategy: Aligning the Snowflake data architecture with overall business intelligence and data strategy goals. Planning for future enhancements and platform evolution.
• Technical Leadership: Providing guidance and mentorship to developers, data engineers, and administrators working with Snowflake.
Key Skills:
• Deep understanding of Snowflake's advanced features and architecture.
• Strong data warehousing concepts and data modeling expertise.
• Solution architecture and system design skills.
• Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates.
• Expertise in performance tuning principles and techniques at an architectural level.
• Strong understanding of data security principles and implementation patterns.
• Knowledge of various data integration patterns (ETL, ELT, Streaming).
• Excellent communication and presentation skills to articulate designs to technical and non-technical audiences.
• Strategic thinking and planning abilities.
Looking for 12+ years of experience to join our team.
Skills: Snowflake, Data modeling, Cloud platforms, Solution architecture
About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
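To illustrate the clustering and warehouse-sizing choices the Snowflake Architect responsibilities describe, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, warehouse, and table names are hypothetical placeholders, not anything specified in the posting.

    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",                 # hypothetical
        user="example_user",                       # hypothetical
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="CORE",
    )

    ddl = """
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id     STRING,
            customer_key NUMBER,
            order_date   DATE,
            amount       NUMBER(12, 2)
        )
        CLUSTER BY (order_date)
    """

    with conn.cursor() as cur:
        cur.execute(ddl)
        # Keep the warehouse small and auto-suspending to control credit spend.
        cur.execute(
            "ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60"
        )

    conn.close()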
Posted 1 week ago
5.0 - 7.0 years
5 - 8 Lacs
Thiruvananthapuram
On-site
Trivandrum India Technology Full time 6/9/2025 J00168258
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance the effectiveness of QA strategies.
What you will do
Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments
Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model.
Pro-actively and collaboratively take part in all testing related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.
What experience you need
Bachelor's degree in a STEM major or equivalent experience
5-7 years of software testing experience
Able to create and review test automation according to specifications
Ability to write, debug, and troubleshoot code in Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation
Created test strategies and plans
Led complex testing efforts or projects
Participated in Sprint Planning as the Test Lead
Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans.
Design and development of microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs
Cloud Certification Strongly Preferred
What could set you apart
An ability to demonstrate successful performance of our Success Profile skills, including:
Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. negative testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards
Automation - Automate defined test cases and test suites per project
Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans
Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points
Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements
Performance / Resilience - Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs
Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates
Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
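This posting centres on Java and JUnit, but as a language-neutral illustration of the kind of automated data-validation check it describes for BigQuery-backed pipelines, here is a small, hedged pytest sketch; the table names are hypothetical and the checks are only examples of the pattern.

    from google.cloud import bigquery

    client = bigquery.Client()

    FACT_TABLE = "example-project.sales_dw.fact_orders"        # hypothetical
    STAGING_TABLE = "example-project.sales_dw.orders_staging"  # hypothetical

    def row_count(table_id: str) -> int:
        return client.get_table(table_id).num_rows

    def test_primary_key_is_unique():
        # Regression check: the business key must stay unique after each load.
        query = f"""
            SELECT COUNT(*) AS dupes FROM (
                SELECT order_id FROM `{FACT_TABLE}`
                GROUP BY order_id HAVING COUNT(*) > 1
            )
        """
        dupes = list(client.query(query).result())[0]["dupes"]
        assert dupes == 0

    def test_fact_covers_staging():
        # The loaded fact table should have at least as many rows as staging.
        assert row_count(FACT_TABLE) >= row_count(STAGING_TABLE)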
Posted 1 week ago