Company Description
Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT and Business Solutions. With over 28,200 associates, it is Bosch's largest software development center outside Germany and the company's Technology Powerhouse in India, with a global footprint and presence in the US, Europe and the Asia Pacific region.
Job Description
Key Responsibilities
- Data Architecture Design
- Define and design Big Data architecture solutions, including data lakes, data warehouses, and real-time processing systems.
- Architect and implement scalable, secure, and high-performance data pipelines and data integration solutions.
- Ensure alignment with industry best practices and organizational goals for data architecture.
- Big Data Ecosystem Management
- Develop and manage workflows using Big Data tools like Hadoop, Spark, Kafka, Hive, and Flink.
- Leverage cloud-based Big Data services (AWS EMR, Azure Synapse, GCP BigQuery, or similar) to optimize performance and scalability.
- Oversee the implementation of streaming data platforms to support real-time analytics.
- Data Modeling and Integration
- Design and maintain data models (conceptual, logical, and physical) that support structured and unstructured data.
- Build robust ETL/ELT processes to ingest, process, and integrate large volumes of diverse data sources.
- Implement APIs and frameworks for seamless data sharing and consumption.
- Data Governance and Security
- Establish frameworks to ensure data quality, lineage, and governance across the data lifecycle.
- Implement security measures for data at rest and in motion using encryption and access controls.
- Ensure compliance with global data regulations such as GDPR, CCPA, or similar.
- Generative AI exposure or experience is mandatory.
- Collaboration and Stakeholder Engagement
- Partner with data engineers, data scientists, business analysts, and IT teams to align architecture with business needs.
- Translate complex technical concepts into actionable insights for stakeholders.
- Performance Optimization and Monitoring
- Monitor and optimize performance of Big Data systems, ensuring low latency and high reliability.
- Troubleshoot and resolve performance bottlenecks in distributed data environments.
- Emerging Technology and Innovation
- Evaluate and implement emerging technologies, such as Graph Databases, NoSQL Systems, and AI-driven analytics platforms.
- Continuously explore innovations in the Big Data ecosystem to drive efficiency and competitive advantage.
Success Criteria
- Explore different tech stacks and architecture design
- Document supporting evidence with KPIs for solution-design decisions, and document product guidelines and protocols for seamless use of the framework
- Prior experience with ETL and Big Data, setting up scalable pipelines that process data in both real time and batch
- Develop configurable solutions that support cross-functional requirements and multiple platforms
- Experience managing cross-functional teams, gathering requirements, and maintaining day-to-day relationships with clients and stakeholders to help them achieve better outcomes
Preferred Qualifications
- Experience:
- 10+ years of experience in data architecture, with at least 3+ years focusing on Big Data technologies.
- 5+ years as a Data Architect, with proficiency working in environments supporting solution design
- Proven track record of delivering end-to-end Big Data solutions in enterprise environments.
- Technical Expertise:
- Strong understanding of Big Data frameworks like Hadoop, Spark, Kafka, Hive, Flink, and Presto.
- Proficiency in cloud-based Big Data platforms (AWS EMR, Azure Synapse, GCP BigQuery, or Databricks).
- Expertise in database systems, including both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Hands-on experience with ETL tools like Talend, Informatica, or Apache NiFi.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) and analytics platforms.
- Certifications:
- Certifications such as AWS Certified Data Analytics, Azure Data Engineer Associate, GCP Professional Data Engineer, or Hadoop certifications are highly desirable.
Key Attributes
- Strong analytical and problem-solving skills with a passion for data-driven innovation.
- Excellent communication and collaboration abilities to engage both technical and non-technical stakeholders.
- Strategic mindset with a focus on scalability, performance, and alignment with business objectives.
- Ability to thrive in fast-paced environments and handle multiple priorities effectively.
Qualifications
BE, BTech, MTech
Additional Information
10 - 16 years of experience