AWS Data Architect
Role: T0
The ideal professional for this AWS Architect role will:
Have a passion for design, technology, analysis, collaboration, agility, and planning, along with a drive for continuous improvement and innovation.
Exhibit expertise in managing high-volume data projects that leverage Cloud Platforms, Data Warehouse reporting and BI Tools, and the development of relational databases.
Research, identify, and internally market enabling data management technologies based on business and end-user requirements.
Seek ways to apply new technology to business processes with a focus on modernizing the approach to data management.
Consult with technical subject matter experts and develop alternative technical solutions. Advise on options, risks, costs versus benefits, and impact on other business processes and system priorities.
Demonstrate strong technical leadership skills and the ability to mentor others in related technologies.
Qualifications
Bachelor's degree in a computer-related field, or equivalent professional experience, is required.
Master's degree in computer science, information systems, or a related discipline (or equivalent, extensive related project experience) preferred.
10+ years of hands-on software development experience building data platforms with tools and technologies such as Hadoop, Cloudera, Spark, Kafka, relational SQL and NoSQL databases, and data pipeline/workflow management tools.
6+ years of experience working with AWS cloud services.
Experience with AWS data platform migrations and hands-on experience with AWS and its tooling.
Experience in Data & Analytics projects is a must.
Data modeling experience – relational and dimensional, with consumption requirements (reporting, dashboarding, and analytics).
Thorough understanding and application of AWS services for cloud data platform and data lake implementation – Amazon S3 data lakes, Amazon EMR, AWS Glue, Amazon Redshift, AWS Lambda, and AWS Step Functions – with file formats such as Parquet and Avro, and table formats such as Iceberg.
Must know the key tenets of architecting and designing solutions on the AWS Cloud.
Expertise and implementation experience in data-specific areas such as AWS data lakes, data lakehouse architecture, and SQL data warehouses.
Apply technical knowledge to architect and design solutions that meet business and IT needs, create Data & Analytics roadmaps, drive POCs and MVPs, and ensure the long-term technical viability of new deployments, infusing key Data & Analytics technologies where applicable.
Be the Voice of the Customer to share insights and best practices, connect with the Engineering team to remove key blockers, and drive migration solutions and implementations.
Familiarity with tools such as dbt and Airflow, and with data test automation.
Must have experience with Python/PySpark/Scala in Big Data environments.
Strong skills in writing SQL queries in Big Data tools such as Hive, Impala, and Presto.
Experience working with and extracting value from large, disconnected, and/or unstructured datasets.