Big Data Architect, Requirements and Responsibilities



  • Lead strategic consulting engagements for Perficient that quickly deliver high-value solutions for our clients.  This involves full lifecycle delivery of advanced business intelligence and data warehousing systems.
  • Represent Perficient as the primary on-site technical contact at client sites.
  • Be a subject matter expert in data warehouse appliances, including Big Data and analytical solutions.
  • Provide technical and managerial leadership in a team that designs and develops path-breaking, large-scale cluster data processing systems.
  • Assist Perficient’s customers in developing strategies that maximize the value of their data.
  • Establish Perficient’s thought leadership in the OSS and Appliances big data space by contributing white-papers, blog entries, and perspectives.

To succeed in this role you should:

  • Have a broad set of technology skills to design and build robust solutions for analyzing big data
  • Be able to learn quickly as the industry evolves
  • Understand business drivers and impacts
  • Be able to grasp the problem at hand and recognize the appropriate approach, tools, and technologies to solve it
  • Be objective in evaluating technical solutions
  • Have a process-driven mindset
  • Leverage social media to establish thought-leadership


Qualifications:

  • BS or MS in Computer Science, Math, or a relevant Business Administration specialty
  • 7+ years of experience with leading-edge business intelligence and data warehousing systems
  • Balance of industry and consulting experience.
  • In-depth understanding of different architectural approaches to managing and querying large data sets
  • Experience leading data warehouse and/or big data platform selection projects
  • Hands-on experience with one or more high-scale or distributed RDBMS and Big Data platforms (SAP HANA, IBM Netezza/InfoSphere BigInsights, HP Vertica, Oracle Exadata and Big Data Appliance)
  • Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)
  • Hands-on experience with BI tools and reporting software (e.g. SAP BI, Microstrategy, Pentaho)
  • Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems
  • Knowledge of leading edge statistical software packages (e.g. SAS, SPSS, KXEN)
  • Knowledge of Hadoop and NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)
  • Above-average interpersonal, communication, and presentation skills – must be able to explain technical concepts and their implications clearly to a wide, non-technical audience
  • Self-starter, with a keen interest in technology and highly motivated towards success
  • Ability to bring clarity to ambiguous situations
  • Proven ability to work in fast-paced, agile environments
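Several of the qualifications above reference distributed data processing platforms such as Hadoop and Amazon Elastic MapReduce. As a rough illustration of the map/reduce pattern those systems are built around, here is a minimal, self-contained sketch in plain Python (the function names and sample input are hypothetical, chosen for illustration only; a real cluster would distribute the map and reduce phases across many nodes):

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in the input split.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: sum the counts per key, as would happen after the
    # framework's shuffle-and-sort step groups pairs by word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Hypothetical input standing in for a distributed file split.
lines = ["big data needs big tools", "data data everywhere"]
counts = reduce_phase(map_phase(lines))
# e.g. counts["data"] == 3, counts["big"] == 2
```

In Hadoop or Elastic MapReduce the same mapper/reducer contract applies, but the framework handles partitioning the input, shuffling intermediate pairs, and running reducers in parallel.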

Published by Aryan Nava

Founder of "BlockchainMind", CTO for two blockchain startups during 2018, Cloud/DevOps Consultant and Blockchain Trainer
