4.0 - 6.0 years
1 - 4 Lacs
Hyderabad
On-site
Job Title: Senior Data Analyst – AdTech (Team Lead)
Location: Hyderabad
Experience Level: 4–6 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role:
We are looking for a highly experienced and hands-on Senior Data Analyst (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.

Key Responsibilities:
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub).
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.

Required Qualifications:
- 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics (GA4), programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation.
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.

Preferred Qualifications:
- Scripting experience in Python or JavaScript for automation and custom ETL development.
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.
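To make the "log-level data" requirement concrete, here is a minimal, purely illustrative sketch of rolling log-level ad-server records up to campaign-level metrics. The record fields (`campaign`, `revenue`, `clicked`) are invented for the example and are not part of any platform's actual log schema.

```python
from collections import defaultdict

# Toy log-level records: one row per impression (field names are hypothetical).
impressions = [
    {"campaign": "c1", "revenue": 0.002, "clicked": True},
    {"campaign": "c1", "revenue": 0.001, "clicked": False},
    {"campaign": "c2", "revenue": 0.004, "clicked": True},
]

def campaign_metrics(logs):
    """Aggregate log-level rows into campaign-level impressions, clicks, revenue, and CTR."""
    agg = defaultdict(lambda: {"impressions": 0, "clicks": 0, "revenue": 0.0})
    for row in logs:
        m = agg[row["campaign"]]
        m["impressions"] += 1
        m["clicks"] += row["clicked"]  # bool counts as 0/1
        m["revenue"] += row["revenue"]
    for m in agg.values():
        m["ctr"] = m["clicks"] / m["impressions"]
    return dict(agg)
```

In practice this aggregation would run as SQL in BigQuery over billions of rows; the Python version just shows the shape of the computation.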
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You'll Be Doing…
We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twins to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon.
- Understanding the business requirements and the technical design.
- Working on data ingestion, preparation, and transformation.
- Developing data streaming applications.
- Debugging production failures and identifying solutions.
- Working on ETL/ELT development.
What We're Looking For...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems. #AI&D

You'll need to have:
- Bachelor's degree or one or more years of work experience.
- Experience with data warehouse concepts and the data management life cycle.
- Experience in any DBMS.
- Experience in shell scripting, Spark, and Scala.
- Experience in GCP, Cloud Composer, and BigQuery.

Even better if you have one or more of the following:
- Two or more years of relevant experience.
- Any relevant ETL/ELT developer certification.
- Certification as a GCP Data Engineer.
- Accuracy and attention to detail.
- Good problem-solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to and influencing stakeholders.

Where you'll be working:
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity:
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
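The ETL/ELT responsibilities above can be sketched in a few lines. This is an illustrative toy, not Verizon's pipeline: the event fields (`line`, `msisdn`, `bytes`) and the in-memory "warehouse" are invented for the example.

```python
import json

# Raw batch input: JSON lines as they might arrive from an upstream feed.
raw_events = [
    '{"line": "wireless", "msisdn": " 555-0101 ", "bytes": "1024"}',
    '{"line": "wireline", "msisdn": "555-0102", "bytes": "2048"}',
]

def extract(lines):
    # Parse each raw line into a dict.
    return [json.loads(line) for line in lines]

def transform(rows):
    # Normalise fields: trim identifiers, convert bytes to megabytes.
    return [
        {"line": r["line"], "msisdn": r["msisdn"].strip(), "mb": int(r["bytes"]) / 1024}
        for r in rows
    ]

def load(rows, warehouse):
    # Append cleaned rows to the target table (a list stands in for a warehouse).
    warehouse.extend(rows)
    return warehouse

warehouse = load(transform(extract(raw_events)), [])
```

A real implementation would swap the list for a BigQuery table and run the steps under an orchestrator, but the extract/transform/load seams stay the same.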
Posted 5 days ago
3.0 years
0 - 0 Lacs
India
Remote
Location:
Work From Office: Pal, Surat
Work From Home: Ahmedabad, Gujarat

About Us
TPots (Shingala Digital Solutions) builds robust web applications and APIs for clients across industries. Join our team to craft scalable Laravel-powered backends and deliver top-notch solutions.

Key Responsibilities
- Design, develop, and maintain Laravel-based web applications and RESTful APIs
- Architect database schemas and optimize complex Eloquent queries
- Implement authentication, authorization, and role-based access control
- Integrate third-party services (payment gateways, CRMs, messaging APIs)
- Write clean, secure, testable code with PHPUnit and Laravel Dusk
- Troubleshoot bugs, performance bottlenecks, and security vulnerabilities
- Collaborate with front-end, DevOps, and QA teams in Agile sprints
- Participate in code reviews and mentor junior developers

Must-Have Qualifications
- 3+ years of hands-on Laravel development (Laravel 6.x–10.x)
- Proficiency in PHP 7.4+, Composer, and modern PHP practices
- Strong MySQL/PostgreSQL skills, including indexing and query optimization
- Experience with API design (REST, JWT/OAuth2) and JSON serialization
- Familiarity with queues (Redis/RabbitMQ), task scheduling, and caching
- Solid understanding of MVC architecture, SOLID principles, and design patterns
- Version control with Git and collaborative branching workflows
- Good communication skills in English

Nice-to-Have
- Experience with Livewire, Inertia.js, or Vue.js integration
- Knowledge of Docker and CI/CD pipelines (GitHub Actions, GitLab CI)
- Familiarity with AWS services (RDS, S3, Lambda) or DigitalOcean
- Exposure to automated testing tools and code-quality linters
- Prior work on multi-tenant or SaaS platforms

What We Offer
- Competitive salary
- Flexible hybrid setup: office in Surat OR remote from Ahmedabad
- Collaborative culture with hackathons, tech talks, and paid leave

To Apply: Email your resume, GitHub/repo links, and a brief cover note to hello@tpots.co
Subject line: Laravel Developer - 3+ yrs
Job Type: Full-time
Pay: ₹16,788.45 - ₹35,000.00 per month
Location Type: In-person
Schedule: Day shift
Work Location: In person
Speak with the employer: +91 8511000586
Posted 5 days ago
0 years
0 Lacs
Noida
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant, SFDC Developer (Experience Cloud)!

Responsibilities:
- Lead analysis, design, development, unit testing, and deployment on the Salesforce platform as part of an Agile cadence.
- Build custom solutions on the platform using LWC, Aura, Apex, and Visualforce.
- Implement standard Salesforce functionality, including sharing rules, roles, and profiles, and use standard features including Workflow, Process Builder, and Flows to create solutions.
- Execute integration testing with Apex test classes and maintain maximum code coverage.
- Develop triggers, batch classes, and scheduled jobs while maintaining best coding standards.
- Write Apex using industry-standard best practices and design patterns.
- Maintain expert-level knowledge of Salesforce system architecture and development best practices to scale implementations.
- Develop complex integrations between Salesforce and other systems, either through custom APIs or middleware tools.
- Work on production incident debugging, analysis, bug fixes, service requests, data loads, and minor/major enhancements.
- Provide business support for critical issues and mentor operations team members.

Qualifications we seek in you!
Minimum Qualifications / Skills
- BS/MS degree in Computer Science or a related technical field involving coding, or equivalent work/technical experience.
- Experience developing Experience Cloud, Sales Cloud, and Service Cloud instances for large enterprises across multiple geographies.
- Salesforce Platform Developer I & II Certification.
- Knowing when to use declarative methods (Process Builder, Flow, and Workflow) versus programmatic methods, and extending the Lightning Platform using Apex and Lightning Web Components.
- Experience using Apex Data Loader.

Preferred Qualifications / Skills
- Demonstrable experience designing, and personally building, Lightning pages for enhanced end-user experiences.
- Experience with Salesforce Sites and Communities.
- Experience integrating Salesforce with external systems (REST & SOAP APIs, JSON & XML, etc.).
- Knowledge of Salesforce platform best practices, coding and design guidelines, and governor limits.
- Excellent communication, documentation, and organizational skills, and the ability to relentlessly prioritize.
- Passion for a fast-paced, high-growth environment.
- Good to have: experience with Conga (Conga Composer for dynamic document generation, contract life cycle, contract approval, and electronic signature via Conga Sign).

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Lead Consultant
Primary Location: India-Noida
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 11, 2025, 9:02:27 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 5 days ago
1.0 years
0 Lacs
Lucknow
On-site
Job Summary
We are looking for a skilled PHP Developer to join our development team. You will be responsible for designing and developing server-side web application logic, maintaining backend components, and integrating front-end elements built by your colleagues into the application. The ideal candidate should have strong experience in PHP frameworks, RESTful APIs, and database management.

Key Responsibilities
- Write clean, well-structured, and efficient PHP code.
- Develop, test, and maintain web applications using PHP and related frameworks (e.g., Laravel, CodeIgniter, Symfony).
- Integrate user-facing elements developed by front-end developers with server-side logic.
- Build and consume RESTful APIs.
- Work with databases such as MySQL, PostgreSQL, or MongoDB.
- Perform code reviews and maintain version control using Git.
- Troubleshoot, test, and maintain the core product software to ensure strong optimization and functionality.
- Collaborate with cross-functional teams including designers, developers, and product managers.
- Stay updated on emerging technologies and apply them to operations and activities.

Required Skills & Qualifications
- Proven experience as a PHP Developer (1–5+ years depending on level).
- Strong knowledge of PHP web frameworks (Laravel preferred).
- Familiarity with front-end technologies such as HTML, CSS, JavaScript, and AJAX.
- Good understanding of object-oriented programming (OOP).
- Experience with SQL/NoSQL databases and writing optimized queries.
- Knowledge of version control tools such as Git.
- Familiarity with Composer, Docker, or CI/CD pipelines is a plus.
- Ability to work independently or in a team environment.

Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience working in an Agile/Scrum development process.
- Understanding of MVC design patterns.
- Knowledge of cloud services (AWS, Azure) is an advantage.
- Experience with CMS platforms like WordPress or Drupal is a plus.
Soft Skills
- Strong analytical and problem-solving skills.
- Good communication and collaboration abilities.
- Time management and the ability to handle multiple projects.

Job Type: Full-time
Pay: Up to ₹30,000.00 per month
Location Type: In-person
Work Location: In person
Speak with the employer: +91 9151010021
Posted 5 days ago
1.0 - 3.0 years
0 - 0 Lacs
Jaipur
On-site
Position Overview:
We are looking for a talented PHP Laravel developer with a minimum of 1–3 years of experience in developing robust web applications using the Laravel framework. The candidate will be responsible for developing, enhancing, and maintaining web applications, ensuring high performance and responsiveness.

Key Responsibilities:
- Application Development: Develop, test, and deploy web applications using Laravel.
- Database Management: Design and maintain database schemas and write optimized queries.
- API Development: Develop and maintain RESTful APIs for front-end and third-party integrations.
- Debugging and Troubleshooting: Identify and fix bugs, and conduct code reviews.
- Maintenance and Optimization: Update and optimize existing web applications.
- Collaboration and Documentation: Work with front-end developers and designers, and document processes clearly.

Required Skills and Qualifications:
- Technical Skills: Proficiency in PHP, Laravel, front-end technologies (HTML, CSS, JavaScript), SQL databases, and version control systems like Git.
- Development Tools: Experience with Composer, npm, and web server technologies.
- Soft Skills: Strong problem-solving, communication, teamwork, and time-management abilities.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹25,000.00 per month
Schedule: Day shift
Work Location: In person
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
CloudWerx is looking for a dynamic Senior Engineer, Data to become a vital part of our vibrant Data Analytics & Engineering team, working in Hyderabad, India. Join the energy and come be part of the momentum!

As a Senior Cloud Data Engineer you will be at the forefront of cloud technology, architecting and implementing cutting-edge data solutions that drive business transformation. You'll have the opportunity to work with a diverse portfolio of clients, from innovative startups to industry leaders, solving complex data challenges using the latest GCP technologies. This role offers a unique blend of technical expertise and client interaction, allowing you to not only build sophisticated data systems but also to consult directly with clients, shaping their data strategies and seeing the real-world impact of your work. If you're passionate about pushing the boundaries of what's possible with cloud data engineering and want to be part of a team that's shaping the future of data-driven decision making, this is your chance to make a significant impact in a rapidly evolving field.

Our goal is to have a sophisticated team equipped with expert technical skills in addition to keen business acumen. Each member of our team adds unique value to the business and the customer. CloudWerx is committed to a culture where we attract the best talent in the industry. We aim to be second to none when it comes to cloud consulting and business acceleration. This is an incredible opportunity to get involved in an engineering-focused cloud consulting company that provides the most elite technology resources to solve the toughest challenges.

This role is a full-time opportunity in our Hyderabad office.
INSIGHT ON YOUR IMPACT
- Lead technical discussions with clients, translating complex technical concepts into clear, actionable strategies that align with their business goals.
- Architect and implement innovative data solutions that transform our clients' businesses, enabling them to harness the full power of their data assets.
- Collaborate with cross-functional teams to design and optimize data pipelines that process petabytes of data, driving critical business decisions and insights.
- Mentor junior engineers and contribute to the growth of our data engineering practice, fostering a culture of continuous learning and innovation.
- Drive the adoption of cutting-edge GCP technologies, positioning our company and clients at the forefront of the cloud data revolution.
- Identify opportunities for process improvements and automation, increasing the efficiency and scalability of our consulting services.
- Collaborate with sales and pre-sales teams to scope complex data engineering projects, ensuring technical feasibility and alignment with client needs.

YOUR QUALIFICATIONS, YOUR INFLUENCE
To be successful in the role, you must possess the following skills:
- Proven experience (typically 4-8 years) in data engineering, with a strong focus on Google Cloud Platform technologies.
- Deep expertise in GCP data services, particularly tools like BigQuery, Cloud Composer, Cloud SQL, and Dataflow, with the ability to architect complex data solutions.
- Strong proficiency in Python and SQL, with the ability to write efficient, scalable, and maintainable code.
- Demonstrated experience in data modeling, database performance tuning, and cloud migration projects.
- Excellent communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.
- Proven ability to work directly with clients, understanding their business needs and translating them into technical solutions.
- Strong project management skills, including experience with Agile methodologies and tools like Jira.
- Ability to lead and mentor junior team members, fostering a culture of knowledge sharing and continuous improvement.
- Track record of staying current with emerging technologies and best practices in cloud data engineering.
- Experience working in a consulting or professional services environment, with the ability to manage multiple projects and priorities.
- Demonstrated problem-solving skills, with the ability to think creatively and innovatively to overcome technical challenges.
- Willingness to obtain relevant Google Cloud certifications if not already held.
- Ability to work collaboratively in a remote environment, with excellent time management and self-motivation skills.
- Cultural sensitivity and adaptability, with the ability to work effectively with diverse teams and clients across different time zones.

Our Diversity and Inclusion Commitment
At CloudWerx, we are dedicated to creating a workplace that values and celebrates diversity. We believe that a diverse and inclusive environment fosters innovation, collaboration, and mutual respect. We are committed to providing equal employment opportunities for all individuals, regardless of background, and actively promote diversity across all levels of our organization. We welcome all walks of life, as we are committed to building a team that embraces and mirrors a wide range of perspectives and identities. Join us in our journey toward a more inclusive and equitable workplace.

Background Check Requirement
All candidates for employment will be subject to pre-employment background screening for this position. All offers are contingent upon the successful completion of the background check. For additional information on the background check requirements and process, please reach out to us directly.
Our Story
CloudWerx is an engineering-focused cloud consulting firm born in Silicon Valley, in the heart of hyper-scale and innovative technology. We help businesses architect, migrate, optimize, and secure their cloud environments, and cut costs. Our team has unique experience working in some of the most complex cloud environments at scale and can help businesses accelerate with confidence.
Posted 5 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Data Analyst – AdTech (Team Lead)
Location: Hyderabad
Experience Level: 4–6 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About The Role
We are looking for a highly experienced and hands-on Senior Data Analyst (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.

Key Responsibilities
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub).
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.

Required Qualifications
- 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics (GA4), programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation.
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.

Preferred Qualifications
- Scripting experience in Python or JavaScript for automation and custom ETL development.
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.
Posted 5 days ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Analyst – AdTech (1+ Years Experience)
Location: Hyderabad
Experience Level: 2–3 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About The Role
We are looking for a highly motivated and detail-oriented Data Analyst with 1+ years of experience to join our AdTech analytics team. In this role, you will work with large-scale advertising and digital media datasets, build robust data pipelines, query and transform data using GCP tools, and deliver insights through visualization platforms such as Looker Studio, Looker, and Tableau.

Key Responsibilities
- Analyze AdTech data (e.g., ads.txt, programmatic delivery, campaign performance, revenue metrics) to support business decisions.
- Design, develop, and maintain scalable data pipelines using GCP-native tools (e.g., Cloud Functions, Dataflow, Composer).
- Write and optimize complex SQL queries in BigQuery for data extraction and transformation.
- Build and maintain dashboards and reports in Looker Studio to visualize KPIs and campaign performance.
- Collaborate with cross-functional teams including engineering, operations, product, and client teams to gather requirements and deliver analytics solutions.
- Monitor data integrity, identify anomalies, and work on data quality improvements.
- Provide actionable insights and recommendations based on data analysis and trends.

Required Qualifications
- 1+ years of experience in a data analytics or business intelligence role.
- Hands-on experience with AdTech datasets and an understanding of digital advertising concepts.
- Strong proficiency in SQL, particularly with Google BigQuery.
- Experience building and managing data pipelines using Google Cloud Platform (GCP) tools.
- Proficiency in Looker Studio.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills with the ability to explain technical topics to non-technical stakeholders.
Preferred Qualifications
- Experience with additional visualization tools such as Tableau, Power BI, or Looker (BI).
- Exposure to data orchestration tools like Apache Airflow (via Cloud Composer).
- Familiarity with Python for scripting or automation.
- Understanding of cloud data architecture and AdTech integrations (e.g., DV360, Ad Manager, Google Ads).
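The "complex SQL queries in BigQuery" responsibility boils down to aggregate queries over delivery tables. Here is an illustrative sketch using an in-memory SQLite table as a stand-in warehouse (the `delivery` table, its columns, and the campaign names are invented; BigQuery's dialect differs in details, but the GROUP BY shape is the same).

```python
import sqlite3

# In-memory stand-in for a warehouse table of campaign delivery rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE delivery (campaign TEXT, impressions INT, revenue REAL)")
con.executemany(
    "INSERT INTO delivery VALUES (?, ?, ?)",
    [("brand_a", 1000, 2.5), ("brand_a", 500, 1.0), ("brand_b", 2000, 6.0)],
)

# Revenue per mille (RPM) per campaign: the kind of KPI a dashboard would surface.
rows = con.execute(
    """
    SELECT campaign,
           SUM(impressions) AS impressions,
           ROUND(SUM(revenue) * 1000.0 / SUM(impressions), 2) AS rpm
    FROM delivery
    GROUP BY campaign
    ORDER BY campaign
    """
).fetchall()
```

The same query against BigQuery would target a dataset-qualified table and run through the BigQuery client rather than `sqlite3`.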
Posted 5 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You'll Be Doing…
We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twins to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon.
- Understanding the business requirements and the technical design.
- Working on data ingestion, preparation, and transformation.
- Developing data streaming applications.
- Debugging production failures and identifying solutions.
- Working on ETL/ELT development.
What We're Looking For...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems. #AI&D

You'll need to have:
- Bachelor's degree or one or more years of work experience.
- Experience with data warehouse concepts and the data management life cycle.
- Experience in any DBMS.
- Experience in shell scripting, Spark, and Scala.
- Experience in GCP, Cloud Composer, and BigQuery.

Even better if you have one or more of the following:
- Two or more years of relevant experience.
- Any relevant ETL/ELT developer certification.
- Certification as a GCP Data Engineer.
- Accuracy and attention to detail.
- Good problem-solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to and influencing stakeholders.

Where you'll be working:
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity:
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
Posted 5 days ago
4.0 years
0 Lacs
Ghaziabad, Uttar Pradesh, India
On-site
Responsibilities
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
● Design, develop, and support data pipelines and related data products and platforms.
● Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
● Perform application impact assessments, requirements reviews, and develop work estimates.
● Develop test strategies and site reliability engineering measures for data products and solutions.
● Participate in agile development and solution reviews.
● Mentor junior Data Engineers.
● Lead the resolution of critical operations issues, including post-implementation reviews.
● Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
● Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
● Demonstrate SQL and database proficiency in various data engineering tasks.
● Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
● Develop Unix scripts to support various data operations.
● Model data to support business intelligence and analytics initiatives.
● Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. ● Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have). Qualifications: ● Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. ● 4+ years of data engineering experience. ● 2 years of data solution architecture and design experience. ● GCP Certified Data Engineer (preferred). Interested candidates can send their resumes to riyanshi@etelligens.in
Posted 5 days ago
4.0 - 5.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
About Sun Pharma: Sun Pharmaceutical Industries Ltd. (Sun Pharma) is the fourth largest specialty generic pharmaceutical company in the world with global revenues of US$ 5.4 billion. Supported by 43 manufacturing facilities, we provide high-quality, affordable medicines, trusted by healthcare professionals and patients, to more than 100 countries across the globe. Job Summary EDMS Development and Configuration specialist will be responsible for the successful development, deployment, configuration, and ongoing support of EDMS 21.2. This role requires a deep understanding of EDMS LSQM workflows, strong technical skills, and the ability to work closely with cross-functional teams to ensure the EDMS meets the needs of the organization. Roles and Responsibilities • Assist in the development and maintenance of Documentum D2 LSQM application, including custom workflows and document management solutions. • Collaborate with senior developers to understand requirements and translate them into technical specifications. • Support the testing and debugging of Documentum applications to ensure high-quality output and performance. • Document development processes and maintain accurate technical documentation. • Solid understanding of content management principles and best practices, with experience in implementing Documentum solutions in enterprise environments. • Familiarity with Java, SQL, and web services integration for developing Documentum applications. • Expertise in Documentum platform and its components, including Documentum Content Server and Documentum Webtop. • Proficiency in using development tools such as Documentum Composer and Documentum Administrator. • Experience with version control systems (e.g., Git) and agile development methodologies. Qualifications and Preferences Qualifications: • Bachelor's degree in Information Technology, or a related field. 
• Minimum of 4-5 years of experience in EDMS LSQM configuration, preferably in a pharmaceutical or biotech environment. • Strong understanding of Category 1, Category 2 & 3 workflows. • Proficiency in Documentum LSQM software. • Ability to manage multiple tasks and projects simultaneously. • Strong analytical and problem-solving skills. • Excellent communication and interpersonal skills. Preferred Qualifications: • Advanced degree in Information Technology or a related field. • Experience with database management and DQL. • Understanding of Documentum Content Server and its APIs. • Familiarity with Documentum DQL (Documentum Query Language). • Experience in Documentum development, including proficiency in Documentum Foundation Classes (DFC) and Documentum Query Language (DQL). • Basic knowledge of RESTful services and web development principles. Selection Process: Interested candidates are mandatorily required to apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an Online Assessment and/or a Technical Screening interview administered by Jigya, on behalf of Sun Pharma. Candidates selected after the screening rounds will be processed further by Sun Pharma.
Posted 5 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary We are looking for an experienced Mac Application Packaging Engineer with 7+ years of hands-on experience in packaging, deployment, scripting, and JAMF Pro administration. The ideal candidate will manage Mac packaging solutions for BFSI applications and work closely with cross-functional teams. Key Responsibilities Develop, test, and maintain macOS application packages using JAMF Composer, Packages, Apple PackageMaker, or Suspicious Package. Deploy applications using JAMF Pro. Implement security protocols and compliance standards in Mac packaging workflows. Create and manage documentation for deployment, packaging, known issues, workarounds, and administrative procedures. Troubleshoot packaging issues and support end users and IT teams. Collaborate with teams to gather requirements and ensure application compatibility. Participate in Incident, Problem, and Change Management processes. Must-Have Skills 7+ years of experience in macOS application packaging Proficiency with JAMF Composer and JAMF Pro Hands-on with Shell scripting, PowerShell, and VBScript Familiarity with tools like Packages, Suspicious Package, Apple PackageMaker Strong troubleshooting and documentation skills Good To Have Knowledge of Munki, Cloudpaging, or Windows MSI packaging Certifications in Apple or JAMF Prior experience in BFSI domain is a plus Behavioral Skills Excellent communication and collaboration abilities Strong leadership and team management experience Proactive problem-solving approach Ability to train and coach junior engineers Educational Qualifications B.Tech / BE / BCA / MCA / M.Tech or equivalent technical degree
Posted 5 days ago
4.0 years
0 Lacs
India
Remote
GCP Data Engineer Remote Type: Fulltime Rate: Market Client: Telus Required Skills: ● 4+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets. ● Design, build and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, data and business intelligence among others. ● Build modular code for reusable pipelines and complex ingestion frameworks that simplify loading data into the data lake or data warehouse from multiple sources. ● Work closely with analysts and business process owners to translate business requirements into technical solutions. ● Coding experience in scripting and languages (Python, SQL, PySpark). ● Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, Google Composer, Airflow, CloudSQL, PostgreSQL, Oracle, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, Vertex AI). ● Maintain the highest levels of development practice, including: technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular and self-sustaining code, with repeatable quality and predictability. ● Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, Docker
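The "modular code for reusable pipelines / ingestion frameworks" requirement above usually means factoring extract, transform, and load into pluggable callables so one framework serves many sources. A minimal sketch of that pattern, assuming nothing about the client's actual stack (all names, sources, and the in-memory "warehouse" are illustrative):

```python
from typing import Any, Callable, Iterable

def build_pipeline(extract: Callable[[], Iterable[Any]],
                   transforms: list,
                   load: Callable[[list], int]) -> Callable[[], int]:
    """Compose pluggable extract/transform/load steps into one runnable pipeline."""
    def run() -> int:
        rows = list(extract())          # pull from any source
        for t in transforms:            # apply each transform in order
            rows = [t(r) for r in rows]
        return load(rows)               # push to any sink; report row count
    return run

# Hypothetical wiring: raw strings -> strip -> lowercase -> in-memory "warehouse".
warehouse: list = []
pipeline = build_pipeline(
    extract=lambda: [" Alice ", "BOB"],
    transforms=[str.strip, str.lower],
    load=lambda rows: (warehouse.extend(rows), len(rows))[1],
)
print(pipeline())   # 2
print(warehouse)    # ['alice', 'bob']
```

The point of the factoring is that a new source or sink only needs a new `extract` or `load` callable; the framework and transforms are reused unchanged, which is what makes an ingestion framework "multi-usable" across sources.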
Posted 5 days ago
5.0 years
0 Lacs
India
Remote
Location - Remote Requirements - Minimum 5 years’ experience in Salesforce. Conga developer with Conga Composer and Conga X-Author skill set. Salesforce PD1 and PD2 certifications and Conga CLM certification are a must.
Posted 5 days ago
5.0 years
0 Lacs
Surat, Gujarat, India
Remote
About Praella: We are a proud Great Place to Work certified organization. We strive for excellence, and we chase perfection for our merchants and team. We build relationships with our merchants that are not reflective of a vendor-like or even a partner-like relationship. We strive to become an extension of who our merchants are. And we strive to become a reflection of our team as an organization. We are also a Webby-winning agency. We are a Shopify Plus partner. We are grateful to be an extension of some of the best e-commerce brands. We are a merchant-first, results-driven team. We have a nothing-is-impossible mentality. We work together and support each other and our clients. Collaboration and camaraderie are everything. We are data-driven, ambitious, and creative - we work hard, and we work smart. - Our founders started one of the first Shopify Plus agencies, which was eventually sold. - We are Shopify Plus Partners and partner with other e-commerce leaders like ReCharge, Klaviyo, Omnisend, Yotpo, Smile, etc. - We have a remote team, but our headquarters is in Chicago. We have a small team in Chicago. Outside of Chicago, we have teams located in Atlanta, Los Angeles, Phoenix, Toronto, Athens (Greece), Sarajevo (Bosnia), and Surat (India). - Do you want to work from Europe or India for a month and travel to nearby destinations on long weekends? Why not? - The majority of our clients are e-commerce-based merchants with annual revenue between $2M–$350M. We are ambitious. And, we want you to be too. We need people who want to be pushed and who want to be challenged. We want people who will push us and who will challenge us. Is that you? Our Website: https://praella.com/ Job Description of Full Stack Developer Praella is seeking skilled Full Stack Developers to join our dynamic team, driving innovation and setting new standards for user experiences. 
We value a unique blend of technical expertise, insatiable curiosity, and a methodical, analytical mindset in our ideal candidates. Objectives of this Role: Regularly communicate progress on the long-term technology roadmap with stakeholders, project managers, quality assurance teams, and fellow developers. Create and maintain workflows to ensure workload balance and consistent visual designs. Develop and oversee testing schedules in the client-server environment, optimizing content display across various devices. Produce high-quality, test-driven, and modular code, setting a benchmark for the entire team. Recommend system solutions by evaluating custom development and purchase alternatives. About the Role: Write clean, secure, and modular PHP and Node.js code, with a focus on object-oriented programming, security, refactoring, and design patterns. Leverage expertise in the Laravel framework, building factories, facades, and libraries using abstract classes, interfaces, and traits. Conduct unit testing using frameworks like PHPUnit/phpspec. Demonstrate proficiency in RDBMS (MySQL/PostgreSQL), NoSQL databases (MongoDB/DynamoDB), and query optimization techniques. Utilize core knowledge of HTML5, CSS3, jQuery, and Bootstrap. Familiarity with JavaScript Frameworks (ReactJS/VueJS) is advantageous. Develop RESTful APIs, including OAuth 2.0 implementation for authentication and authorization. Experience in microservices development is a plus. Proficient in Git, with a clear understanding of Git workflows, BitBucket, and CI/CD processes. Familiarity with cloud servers (Heroku/Digital Ocean), Docker/Homestead, and server administration (Apache/Nginx, php-fpm). Create Composer packages and work with webpack, gulp.js, and Babel for browser support. Strong problem-solving and analytical skills. Excellent written and verbal communication skills in English. Additional Skills: Proficiency in Node.js. 
Experience with the Shopify E-commerce platform would be a valuable additional skill set. Qualifications: Demonstrable experience with PHP, Laravel, Node.js, and relevant frameworks. Experience with RDBMS and NoSQL databases. Proficiency in front-end technologies such as HTML5, CSS3, jQuery, and Bootstrap. Strong understanding of RESTful API design and development. Working knowledge of Git, CI/CD processes, and cloud servers. Familiarity with Docker, composer packages, and build tools. What you can bring to the table: Join Praella and be a part of a team shaping the future of user experiences. Your expertise will play a key role in our continued success and client satisfaction. Passion for learning and adapting to new technologies. Strong problem-solving skills and analytical mindset. Excellent written and verbal communication skills in English. Experience: 5+ Years of relevant industry experience. Education: B.E/B.Tech/B.Sc [(C.S.E)/I.T], M.C.A, M.Sc (I.T) Life At Praella Private Limited Benefits and Perks 5 days working Fully Paid Basic Life/ Competitive salary Vibrant Workplace PTO/Paid Offs/Annual Paid Leaves/Paternal Leaves Fully Paid Health Insurance. Quarterly Incentives Rewards & Recognitions Team Outings Our Cultural Attributes Growth mindset People come first Customer obsessed Diverse & inclusive Exceptional quality Push the envelope Learn and grow Equal opportunity to grow. Ownership Transparency Team Work. Together, we can…!!!!!
Posted 5 days ago
4.0 years
0 Lacs
India
Remote
Oracle Fusion OIC (VBCS + Reports Exp added an advantage) 4 - 9 years of IT experience relevant to this position. Experience in Development of Integrations using OIC for the Oracle Fusion Modules like Financials / SCM Should have excellent skills in Webservice technologies such as XML / XPATH / XSLT / SOAP / WSDL / XSD / JSON and REST Technologies Must have implemented integrations using web service and technology adapters like FTP / File / SOAP / REST / DB /JMS and AQ Experience in designing solutions in UI (VBCS) and Integration (OIC) space. Experience in developing SaaS Extensions using VBCS, OIC & ORDS. Understanding of Inherent tools and technologies of SaaS Applications (FBDI, BIP, ADFDI, Applications Composer, Page Integration, etc.) Expertise in Oracle Visual Builder Studio, Strong experience with Build and Release, Systems Integration. Knowledge in configuring SSO PaaS extensions with Fusion SaaS. Good understanding and usage of OCI architecture, functions, API Gateway, object storage is an added advantage. Good to have BPM skills. Have experience of building at least one project from scratch. Experience in performance tuning / testing and diagnosis of OIC Integrations Design and Develop integrations in OIC to Oracle ERP Cloud including extracting Cloud data using BI Publisher Reports. Hands on Experience on Encryption and Decryption in FTP Good hands-on experience in monitoring and debugging of OIC integration and migration of OIC components. Hands-on experience in data migration/integration methods SOAP and Rest web services, FBDI and ADFDI Excellent client interfacing skills / working with IT and as well as business stakeholders and writing technical design documents. Ability to leverage pre-built integrations, cloud adapters, on-premises adapters, connections, SaaS applications etc. in the solution Added advantage of having good understanding of Visual Builder Cloud Service and Process Cloud Service Added advantage of OCI knowledge. 
Experience in Development of Oracle Fusion BIP / OTBI / EText Reports Experience in Development of ESS Jobs for the modules of Financials / SCM Location: Hyderabad / Gurgaon / Remote
Posted 5 days ago
0.0 years
0 Lacs
Adajan, Surat, Gujarat
Remote
Location: Work From Office: Pal, Surat Work From Home: Ahmedabad, Gujarat About Us TPots (Shingala Digital Solutions) builds robust web applications and APIs for clients across industries. Join our team to craft scalable Laravel-powered backends and deliver top-notch solutions. Key Responsibilities Design, develop & maintain Laravel-based web applications and RESTful APIs Architect database schemas and optimize complex Eloquent queries Implement authentication, authorization, and role-based access control Integrate third-party services (payment gateways, CRMs, messaging APIs) Write clean, secure, testable code with PHPUnit and Laravel Dusk Troubleshoot bugs, performance bottlenecks, and security vulnerabilities Collaborate with Front-end, DevOps & QA in Agile sprints Participate in code reviews, mentoring junior developers Must-Have Qualifications 3+ years hands-on Laravel development (Laravel 6.x–10.x) Proficiency in PHP 7.4+, Composer, and modern PHP practices Strong MySQL/PostgreSQL skills, including indexing & query optimization Experience with API design (REST, JWT/OAuth2) and JSON serialization Familiarity with queues (Redis/RabbitMQ), task scheduling, and caching Solid understanding of MVC architecture, SOLID principles, and design patterns Version control with Git and collaborative branching workflows Good communication skills in English Nice-to-Have Experience with Livewire, Inertia.js or Vue.js integration Knowledge of Docker, CI/CD pipelines (GitHub Actions, GitLab CI) Familiarity with AWS services (RDS, S3, Lambda) or DigitalOcean Exposure to automated testing tools and code-quality linters Prior work on multi-tenant or SaaS platforms What We Offer Competitive salary Flexible hybrid setup: office in Surat OR remote from Ahmedabad Collaborative culture with hackathons & tech talks Paid leave To Apply: Email your resume, GitHub/repo links, and a brief cover note to hello@tpots.co Subject line: Laravel Developer—3+ yrs Job Type: Full-time Pay: ₹16,788.45 - ₹35,000.00 per month Location Type: In-person Schedule: Day shift Work Location: In person Speak with the employer +91 8511000586
Posted 5 days ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a highly experienced Senior Data Engineer with strong expertise in Google Cloud Platform (GCP) and exposure to machine learning engineering (MLE) to support high-impact banking initiatives. The ideal candidate will combine hands-on engineering skills, architectural insight, and a proven track record of building secure, scalable, and intelligent data solutions in financial services. Location: Pune, India (Work from Office - Completely onsite) Experience Required: Minimum 4-5 years Position Type: Full-time Start Date: Immediate or as per notice period Key Responsibilities: Provide technical leadership on GCP data projects, collaborating with business, data science, and ML teams. Design and implement scalable data pipelines using GCP tools (BigQuery, Dataflow, Composer, etc.). Support MLOps workflows, including feature engineering and real-time inference. Ensure secure, compliant, and high-quality data infrastructure aligned with banking standards. Optimize BigQuery performance and cost efficiency for large-scale datasets. Enable BI insights using tools like Power BI and Looker. Own the end-to-end data lifecycle across development, deployment, and monitoring. Required Skills: 6–10 years of experience in data engineering; 3+ on GCP. Deep proficiency in GCP services (BigQuery, Dataflow, Composer, Dataproc). Strong Python and SQL skills; familiarity with Terraform and CI/CD tools. Experience supporting ML pipelines and maintaining compliance in regulated environments. Preferred: GCP certifications (Professional Data Engineer / Architect). Familiarity with MLOps (Vertex AI, Kubeflow), financial data domains, and streaming data.
Posted 5 days ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description Consultant Delivery ( Data Engineer) About Worldline At Worldline, we are pioneers in payments technology, committed to creating innovative solutions that make financial transactions secure, accessible, and seamless worldwide. Our diverse team of professionals collaborates across cultures and disciplines, driving progress that benefits society and businesses of all sizes. We believe that diverse perspectives fuel innovation and are dedicated to fostering an inclusive environment where all individuals can thrive. The Opportunity We are seeking a highly skilled and knowledgeable Data Engineer to join our Data Management team on a transformative Move to Cloud (M2C) project. This role offers a unique opportunity to contribute to a critical initiative, migrating our data infrastructure to the cloud and optimizing our data pipelines for performance and scalability. We welcome applicants from all backgrounds and experiences, believing that our strength lies in our diversity. Technical Skills & Qualifications Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience: Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based solutions, preferably within the Google Cloud Platform (GCP) ecosystem. Essential Skills: Strong knowledge of version control systems and CI/CD pipelines. Proficiency in GCP services, particularly DataProc, Dataflow, Cloud Functions, Workflows, Cloud Composer, and BigQuery. Extensive experience with ETL tools, specifically dbt Labs, and a deep understanding of ETL best practices. Proven ability to build and optimize data pipelines, architectures, and datasets from both structured and unstructured data sources. Proficiency in SQL and Python, with experience using Spark. Excellent analytical and problem-solving skills, with the ability to translate complex requirements into technical solutions. 
Desirable Skills: Relevant certifications in Google Cloud Platform or other data engineering credentials. Preferred Skills Experience migrating data from on-premises data warehouses (e.g., Oracle) to cloud-based solutions. Experience working with large-scale datasets and complex data transformations. Strong communication and interpersonal skills, with the ability to collaborate effectively within a team environment. Why Join Us? At Worldline, we believe that embracing diversity and promoting inclusion drives innovation and success. We foster a workplace where everyone feels valued and empowered to bring their authentic selves. We offer extensive training, mentorship, and development programs to support your growth and help you make a meaningful impact. Join a global team of passionate professionals shaping the future of payments technology—where your ideas, experiences, and perspectives are appreciated and celebrated. Learn more about life at Worldline at Jobs.worldline.com. We are an Equal Opportunity Employer. We do not discriminate based on race, ethnicity, religion, color, national origin, sex (including pregnancy and childbirth), sexual orientation, gender identity or expression, age, disability, or any other legally protected characteristic. We are committed to creating a diverse and inclusive environment for all employees.
Posted 6 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. 
Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter who identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 6 days ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What You’ll Do Design, develop, and operate high scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 2+ years of software engineering experience 2+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 2+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 2+ years experience designing and developing microservices using Java, Spring Framework, GCP SDKs, GKE/Kubernetes 2+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Big Data Technologies: Spark/Scala/Hadoop What could set you apart Experience designing and developing big data processing solutions using DataProc, Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others Cloud Certification, especially in GCP Self-starter who identifies/responds to priority shifts with minimal supervision. You have excellent leadership and motivational skills You have an inquisitive and innovative mindset with a proven ability to recognize opportunities to create distinctive value You can successfully evaluate workload to drive efficiency
Posted 6 days ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a skilled professional with over 3-5 years of experience in Google Cloud Platform (GCP) services, including Dataproc and Composer. Note: We are currently hiring candidates who are available to join immediately. Key Responsibilities: Design, configure, and manage GCP infrastructure services like VPC, Compute Engine, Load Balancers, IAM, CloudSQL, BigQuery, and Cloud Storage Automate infrastructure provisioning using Terraform and Ansible Implement CI/CD pipelines using Jenkins / CloudBees Manage logging and monitoring with Stackdriver (Cloud Monitoring & Logging) Write scripts in Bash and PowerShell to automate operations on Linux/Windows environments Install and configure Looker for business intelligence workflows Ensure GCP DevOps services such as Cloud Build and Cloud Code are leveraged efficiently Mandatory Skills: GCP Build, Cloud Code, GCP DevOps Services Ansible & Terraform scripting Strong knowledge of Linux/Windows system administration Experience with Looker installation and configuration Good to Have: Experience with Jenkins/CloudBees Familiarity with monitoring & alerting best practices GCP certifications preferred
Posted 6 days ago
20.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Company Overview iFlair Web Technologies Pvt. Ltd. is a premier software development company with expertise in web, mobile, and e-commerce technologies. With over 20 years of industry experience, we have successfully delivered 3500+ projects to global clients. Our solutions are tailored to meet business needs and enhance customer experiences. (https://www.iflair.com) Job Summary We are looking for a highly skilled and experienced Technical Team Leader with deep expertise in Laravel and PHP frameworks. The ideal candidate will have a minimum of 5 years of experience in web development and will be responsible for leading a team of developers, ensuring code quality, and driving the technical success of projects. Experience or exposure to mobile application development will be an added advantage. Key Responsibilities · Lead and manage a team of Laravel/PHP developers to deliver high-quality projects. · Design and develop scalable web applications using the Laravel framework. · Architect robust, secure, and scalable PHP-based applications. · Conduct code reviews, mentor team members, and enforce best practices in development. · Collaborate with project managers, designers, and other teams to ensure smooth project delivery. · Manage project timelines, risks, and resource planning. · Troubleshoot and debug complex technical issues. · Stay up-to-date with Laravel and PHP trends, tools, and practices. · Ensure documentation and technical specifications are maintained. · Make informed decisions in the best interest of project execution and work independently when needed. Required Qualifications · Bachelor’s or Master’s degree in Computer Science, Engineering, or related field. · Minimum 5 years of experience in PHP and Laravel development. · Strong understanding of OOP principles, MVC architecture, and RESTful API development. · Proficient in MySQL, HTML, CSS, JavaScript, and modern front-end frameworks (e.g., Vue.js or React). 
· Experience in using Git, Composer, and other development tools. · Excellent problem-solving, debugging, and analytical skills. · Strong leadership and communication abilities.
Posted 6 days ago
5.0 years
6 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role you will: Design and Develop ETL Processes: Lead the design and implementation of ETL processes using all kinds of batch/streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs. Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks. Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting. GCP Dataflow Development: Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy. Collaborate with data analysts and data scientists to prepare data for analysis and reporting. 
Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.

Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities.

Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team.

Requirements

To be successful in this role, you should meet the following requirements:

Education: Bachelor’s degree in Computer Science, Information Systems, or a related field.

Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.

Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience with cloud-based solutions, especially in GCP; cloud-certified candidates are preferred. Experience with big data processing in both batch and streaming modes, and proficiency in big data ecosystems, e.g. Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark. Familiarity with Java and Python for data manipulation on cloud/big data platforms.

Analytical Skills: Strong problem-solving skills with keen attention to detail. Ability to analyze complex data sets and derive meaningful insights.

Benefits: Competitive salary and comprehensive benefits package.
Opportunity to work in a dynamic and collaborative environment on cutting-edge data projects. Professional development opportunities to enhance your skills and advance your career.

If you are a passionate data engineer with expertise in ETL processes and a desire to make a significant impact within our organization, we encourage you to apply for this exciting opportunity!

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI
Posted 6 days ago