3.0 - 7.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Job Summary
Synechron is seeking a motivated and skilled Data Engineer specializing in Google Cloud Platform (GCP) to join our innovative team. This role is integral to designing, implementing, and managing scalable and secure cloud-based data solutions that drive our business objectives forward. The Data Engineer will play a key role in ensuring optimal performance and security of cloud solutions, working collaboratively with clients and internal stakeholders.

Software Requirements
Required Software Skills:
Google Cloud Platform (GCP): proficiency in core services and tools.
AWS or Azure: basic understanding and experience.
Virtualization: experience with virtualization technologies.
Networking and Security: strong skills in cloud networking and security protocols.
Preferred Software Skills:
Familiarity with other cloud platforms such as AWS or Azure beyond basic use.
Experience with data management tools and libraries.

Overall Responsibilities
Design, implement, and manage cloud-based data solutions tailored to client needs.
Ensure solutions are secure, scalable, and optimized for performance.
Collaborate with clients and stakeholders to identify, troubleshoot, and resolve technical issues.
Participate in project planning and management, contributing to timelines and resource allocation.
Enhance industry knowledge and best practices within Synechron.

Technical Skills (By Category)
Programming Languages: Required - proficiency in programming or scripting languages commonly used in cloud environments (e.g., Python, SQL); an illustrative sketch follows this listing.
Databases/Data Management: Essential - experience with cloud-based data storage solutions.
Cloud Technologies: Essential - Google Cloud Platform expertise. Preferred - AWS and Azure familiarity.
Frameworks and Libraries: Preferred - knowledge of data processing frameworks such as Apache Beam or Kafka.
Development Tools and Methodologies: Required - Agile development experience.
Security Protocols: Essential - understanding of cloud security best practices.

Experience Requirements
2-3 years of experience in a similar role or related field.
Relevant experience in cloud computing, IT infrastructure, or related fields.
Hands-on experience with GCP; familiarity with AWS or Azure is a plus.

Day-to-Day Activities
Collaborate with team members and clients to develop cloud-based solutions.
Conduct regular meetings and updates to track progress and address challenges.
Deliver high-quality data solutions and ensure project milestones are met.
Exercise decision-making authority in technical design and implementation.

Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Relevant certifications in cloud technologies are preferred.
Commitment to continuous professional development and staying updated on industry trends.

Professional Competencies
Strong critical thinking and problem-solving capabilities.
Ability to work collaboratively within a team and lead where necessary.
Excellent written and verbal communication skills for effective stakeholder management.
Adaptability to evolving technologies and learning new tools.
Innovative mindset with a focus on optimization and efficiency.
Effective time and priority management skills.
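The posting above asks for Python and SQL against cloud-based data storage on GCP. Purely as a hedged illustration of that kind of work, and not part of the listing itself, the sketch below runs a parameterized SQL query through the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical placeholders.

```python
# A minimal sketch (not from the posting): run a parameterized SQL query on
# BigQuery from Python. Project, dataset, and table names are hypothetical.
from datetime import date

from google.cloud import bigquery


def daily_order_totals(project_id: str, min_date: date) -> list[dict]:
    """Return per-day order totals on or after min_date."""
    client = bigquery.Client(project=project_id)
    query = """
        SELECT order_date, SUM(amount) AS total_amount
        FROM `my_dataset.orders`
        WHERE order_date >= @min_date
        GROUP BY order_date
        ORDER BY order_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("min_date", "DATE", min_date),
        ]
    )
    rows = client.query(query, job_config=job_config).result()
    return [dict(row) for row in rows]


if __name__ == "__main__":
    for row in daily_order_totals("my-gcp-project", date(2024, 1, 1)):
        print(row)
```

Running this assumes a GCP project with the dataset already in place and application-default credentials configured on the machine.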
Posted 2 months ago
5.0 - 10.0 years
8 - 17 Lacs
Dubai, Hyderabad, Chennai
Work from Office
We require a Tekla Detailer with experience in the steel industry. The candidate must be proficient in AutoCAD, 3D modelling, Revit, and Tekla Structures.
Posted 2 months ago
1.0 - 3.0 years
3 - 5 Lacs
Mumbai
Work from Office
Project Execution:
Study the structural and architectural drawings of assigned projects along with the respective DBR to understand the GA layouts and the project as a whole.
Prepare various loading diagrams and loading sheets, either to assist the design team or for the assigned project.
Model structures for the assigned project with the help of software, as per seniors' instructions and the DBR.
Get the design countersigned by his/her seniors.
Comply with the Document Controller (Design) process for the assigned project, or as instructed by seniors.
Maintain appropriate filing and documentation of the assigned work/project.
Coordinate with various agencies as and when instructed by the Team Leader.
Complete the work in the stipulated time frame; if any extension is required, it must be discussed with or brought to the Team Leader's notice before starting the work.
Visit the sites of projects he/she has handled, along with company engineers.

Optimization of Design:
Detailing and drawing checking of the following structural elements using office spreadsheets:
Slab (conventional one-way, two-way, cantilever).
Flat slab.
Beams and columns/shear walls.
Foundation/raft/pile/pile cap.
Retaining wall.
Carry out basic exercises/comparative studies for optimization on the basis of the estimates prepared.

Documentation:
Prepare the design document for the work and get it countersigned by a senior.
Posted 2 months ago
1.0 - 4.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Job Area: Information Technology Group > IT Data Engineer

General Summary:
We are looking for a savvy Data Engineer to join our analytics team. The candidate will be responsible for expanding and optimizing our data and data pipelines, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate has Python development experience and is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. We believe a candidate with a solid software engineering/development background is a great fit; however, we also recognize that each candidate has a unique blend of skills. The Data Engineer will work with database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams. The right candidate will be excited by the prospect of optimizing data to support our next generation of products and data initiatives.

Responsibilities for Data Engineer:
Create and maintain optimal data pipelines; assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing for greater scalability, etc.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders including the Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Work with data and analytics experts to strive for greater functionality in our data systems.
Perform ad hoc analysis and report QA testing.
Follow Agile/Scrum development methodologies within analytics projects.
Working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience building and optimizing big data pipelines and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Good communication skills, a great team player, and someone who has the hunger to learn newer ways of problem solving.
Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
Working knowledge of Unix or shell scripting.
Construct methods to test user acceptance and usage of data.
Knowledge of predictive analytics tools and problem solving using statistical methods is a plus.
Experience supporting and working with cross-functional teams in a dynamic environment.
Demonstrated understanding of the Software Development Life Cycle.
Ability to work independently and with a team in a diverse, fast-paced, and collaborative environment.
Excellent written and verbal communication skills.
A quick learner with the ability to handle development tasks with minimum or no supervision.
Ability to multitask.
We are looking for a candidate with 7+ years of experience in a Data Engineering role.
They should also have experience with the following software/tools:
Python, Java, etc.
Google Cloud Platform.
Big data frameworks and tools: Apache Hadoop, Beam, Spark, Kafka.
Workflow management and scheduling using Airflow, Prefect, or Dagster (an illustrative sketch follows this listing).
Databases such as BigQuery and ClickHouse.
Container orchestration (Kubernetes).
Optional: experience with one or more BI tools (Tableau, Splunk, or equivalent).

Bachelor's degree and 7+ years of Data Engineer / Software Engineer (Data) experience.

Minimum Qualifications:
4+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems or a related field, OR 6+ years of IT-related work experience without a Bachelor's degree.
2+ years of work experience with programming (e.g., Java, Python).
1+ year of work experience with SQL or NoSQL databases.
1+ year of work experience with data structures and algorithms.
Bachelor's / Master's or equivalent degree in computer engineering or an equivalent stream.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.

To all Staffing and Recruiting Agencies: Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications.

If you would like more information about this role, please contact Qualcomm Careers.
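The tools list above mentions workflow management and scheduling with Airflow, Prefect, or Dagster. As a hedged illustration only, and not part of the Qualcomm posting, a minimal Airflow 2.x DAG that schedules one daily Python task could look like the sketch below; the DAG id, task id, and the extract_and_load helper are hypothetical.

```python
# Minimal sketch of a daily Airflow DAG (Airflow 2.4+); names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(ds: str, **_) -> None:
    # Placeholder for real extract/load logic; `ds` is the logical run date.
    print(f"Extracting source data for {ds} and loading it to the warehouse")


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_task = PythonOperator(
        task_id="extract_and_load_orders",
        python_callable=extract_and_load,
    )
```

Dropped into an Airflow dags/ folder, the scheduler would pick this up and run the task once per day; a real pipeline would replace the print with actual extract and load logic.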
Posted 2 months ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,
We are hiring a Scala Developer to work on high-performance distributed systems, leveraging the power of functional and object-oriented paradigms. This role is perfect for engineers passionate about clean code, concurrency, and big data pipelines.

Key Responsibilities:
Build scalable backend services using Scala and the Play or Akka frameworks.
Write concurrent and reactive code for high-throughput applications.
Integrate with Kafka, Spark, or Hadoop for data processing.
Ensure code quality through unit tests and property-based testing.
Work with microservices, APIs, and cloud-native deployments.

Required Skills & Qualifications:
Proficient in Scala, with a strong grasp of functional programming.
Experience with Akka, Play, or Cats.
Familiarity with Big Data tools and RESTful API development.
Bonus: experience with ZIO, Monix, or Slick.

Soft Skills:
Strong troubleshooting and problem-solving skills.
Ability to work independently and in a team.
Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
Posted 2 months ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview:
The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.

Key Responsibilities:
Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow); an illustrative sketch follows this listing.
Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
Implement and monitor data quality, lineage, and governance frameworks across the data platform.
Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
Mentor team members and contribute to the growth of technical capabilities across the organization.

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams.
Skills:
Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub.
Strong SQL and Python skills for data processing and orchestration.
Experience with workflow orchestration tools (Airflow/Composer).
Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform).
Familiarity with data security, governance, and compliance practices in cloud environments.
Certifications: GCP Professional Data Engineer certification.
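The responsibilities above centre on batch and real-time ETL/ELT with Apache Beam and Dataflow. As an illustrative sketch only, not taken from the posting, a minimal Beam streaming pipeline that reads JSON messages from Pub/Sub and writes rows to BigQuery might look like this; the subscription path, table, and field names are hypothetical, and a real Dataflow job would also pass project, region, and runner options.

```python
# Minimal sketch: Apache Beam pipeline reading from Pub/Sub and writing to
# BigQuery. Subscription, table, and schema below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_message(message: bytes) -> dict:
    # Each Pub/Sub message is assumed to carry a small JSON payload.
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["event_id"], "amount": float(record["amount"])}


def run() -> None:
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/my-subscription"
            )
            | "ParseJson" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:my_dataset.events",
                schema="event_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Run as-is this would use the local DirectRunner; submitting it to Dataflow would add runner, project, and region options on the command line.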
Posted 2 months ago
1 - 5 years
6 - 11 Lacs
Pune
Work from Office
About The Role:
Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description
As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, to ensure that the necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy.
Gender neutral parental leaves.
100% reimbursement under childcare assistance benefit (gender neutral).
Sponsorship for industry-relevant certifications and education.
Employee Assistance Program for you and your family members.
Comprehensive hospitalization insurance for you and your dependents.
Accident and term life insurance.
Complimentary health screening for 35 yrs. and above.

Your key responsibilities
Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and customer and business intelligence solutions.
Partner with service/backend engineers to integrate data provided by legacy IT solutions into the databases you design and make it accessible to the services consuming those data.
Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of customer intelligence, financial reporting and performance controlling.
Contribute to data harmonization as well as data cleansing.
A passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
Together with your team, you will run and develop your application self-sufficiently.
You'll collaborate with Product Owners as well as team members on the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
When you see a process running with high manual effort, you'll fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience
Mandatory Skills:
Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
Excellent knowledge of SQL and NoSQL databases.
Experience working in a fast-paced and Agile work environment.
Working knowledge of public cloud environments.
Preferred Skills:
Experience in Dataflow (Apache Beam), Cloud Functions, Cloud Run.
Knowledge of workflow management tools such as Apache Airflow/Composer.
Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
Knowledge of GCS buckets, Google Pub/Sub, BigQuery.
Knowledge of ETL processes in the data warehouse/data lake environment and how to automate them (an illustrative sketch follows this listing).
Nice to have:
Knowledge of provisioning cloud resources using Terraform.
Knowledge of shell scripting.
Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
Knowledge of Google Cloud Monitoring & Alerting.
Knowledge of Cloud Run, Dataform, Cloud Spanner.
Knowledge of the Data Vault 2.0 data warehouse approach.
Knowledge of New Relic.
Excellent analytical and conceptual thinking.
Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
Good communication and experience in working with distributed teams (especially Germany + India).

How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
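The preferred skills in this listing mention GCS buckets, BigQuery, and automating ETL in a data warehouse environment. The sketch below is offered only as a hedged illustration, not as part of the role description: it batch-loads a CSV file from a GCS bucket into a BigQuery table with the google-cloud-bigquery client, and the bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch: batch-load a CSV file from a GCS bucket into BigQuery.
# Bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery


def load_csv_from_gcs(project_id: str) -> None:
    client = bigquery.Client(project=project_id)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,        # skip the CSV header row
        autodetect=True,            # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/exports/customers.csv",
        "my_dataset.customers",
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes
    table = client.get_table("my_dataset.customers")
    print(f"Loaded {table.num_rows} rows into {table.full_table_id}")


if __name__ == "__main__":
    load_csv_from_gcs("my-gcp-project")
```

In practice a load like this would typically be wrapped in a Composer/Airflow task or a Cloud Function trigger rather than run by hand.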
Posted 2 months ago
10 - 16 years
35 - 40 Lacs
Pune
Work from Office
About The Role:
Job Title: Lead Engineer, VP
Location: Pune, India

Role Description
A Passion to Perform. It's what drives us. More than a claim, this describes the way we do business. We're committed to being the best financial services provider in the world, balancing passion with precision to deliver superior solutions for our clients. This is made possible by our people: agile minds, able to see beyond the obvious and act effectively in an ever-changing global business landscape. As you'll discover, our culture supports this. Diverse, international and shaped by a variety of different perspectives, we're driven by a shared sense of purpose. At every level agile thinking is nurtured. And at every level agile minds are rewarded with competitive pay, support and opportunities to excel.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy.
Gender neutral parental leaves.
100% reimbursement under childcare assistance benefit (gender neutral).
Sponsorship for industry-relevant certifications and education.
Employee Assistance Program for you and your family members.
Comprehensive hospitalization insurance for you and your dependents.
Accident and term life insurance.
Complimentary health screening for 35 yrs. and above.

Your key responsibilities
Designing, implementing and operationalising Java-based software components for the Transaction Monitoring Data Controls applications.
Contributing to DevOps capabilities to ensure maximum automation of our applications.
Leveraging best practices to build data-driven decisions.
Collaboration across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimum solutions for the business, increasing re-use, creating best practice and sharing knowledge.

Your skills and experience
13+ years of hands-on experience of Java development (Java 11+) in either of: Spring Boot/microservices/APIs/transactional databases, or Java data processing frameworks such as Apache Spark, Apache Beam, Flink.
Experience of contributing to software design and architecture, including consideration of meeting non-functional requirements (e.g., reliability, scalability, observability, testability).
Understanding of relevant architecture styles and their trade-offs, e.g., microservices, monolith, batch.
Professional experience in building applications on one of the cloud platforms (Azure, AWS or GCP) and usage of their major infra components (software-defined networks, IAM, compute, storage, etc.).
Professional experience of at least one data storage technology (e.g., Oracle, BigQuery).
Experience designing and implementing distributed enterprise applications.
Professional experience of at least one CI/CD tool such as TeamCity, Jenkins, GitHub Actions.
Professional experience of Agile build and deployment practices (DevOps).
Professional experience of defining interface and internal data models, both logical and physical.
Experience of working with a globally distributed team requiring remote interaction across locations, time zones and diverse cultures.
Excellent communication skills (verbal and written).

Ideal to Have
Professional experience working with Java components on GCP (e.g. App Engine, GKE, Cloud Run).
Professional experience working with Red Hat OpenShift & Apache Spark.
Professional experience working with Kotlin.
Experience of working in one or more large data integration projects/products.
Experience and knowledge of data engineering topics such as partitioning and optimisation based on different goals (e.g. retrieval performance vs insert performance).
A passion for problem solving with strong analytical capabilities.
Experience related to any of payment scanning, fraud checking, integrity monitoring, payment lifecycle management.
Experience working with Drools or a similar product.
Data modelling experience.
Understanding of data security principles, data masking, and implementation considerations.

Education/Qualifications
Degree from an accredited college or university with a concentration in Engineering or Computer Science.

How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago