Solutions Architect

  • Job Reference: 657828610-2
  • Date Posted: 14 July 2022
  • Recruiter: Cloudera
  • Location: Virginia Beach, Virginia
  • Salary: On Application
  • Sector: I.T. & Communications
  • Job Type: Permanent

Job Description

Cloudera is seeking an experienced Solutions Architect to join our Public Sector team. This key role has two major responsibilities: first, working directly with our Federal customers and partners to optimize their plans and objectives for architecting, designing, and deploying Big Data environments built on the Cloudera product suite; and second, helping build and design reference configurations that enable our customers and influence our product direction.

The Solutions Architect will facilitate the communication flow between Cloudera teams and the customer.


Responsibilities:

- Work directly with federal customers' technical resources to devise and recommend solutions based on their requirements

- Analyze complex distributed production deployments and make recommendations to optimize performance

- Document and present complex architectures to customers' technical teams

- Work closely with Cloudera's teams at all levels to help ensure the success of consulting engagements with customers

- Help design and implement Big Data architectures and configurations for customers

- Drive projects with customers to successful completion

- Write and produce technical documentation and knowledge base articles

- Participate in the pre- and post-sales process, helping both the sales and product teams to develop customers' requirements

- Keep current with the Hadoop Big Data ecosystem technologies and the Cloudera suite of products

- Attend speaking engagements when needed


Requirements:

- More than four years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions with a Federal or IC agency

- Experience designing and deploying large-scale production Hadoop solutions

- Ability to understand and translate customer requirements into technical requirements

- Experience designing data queries against data in a Hadoop environment using tools such as Apache Hive, Apache Phoenix, Apache Spark, or others

- Experience installing and administering multi-node Hadoop clusters

- Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment

- Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos

- Good understanding of network configuration, devices, protocols, speeds and optimizations

- Strong understanding of software development, debugging, and profiling

- Knowledge of structured programming languages and scripting languages

- Strong experience using network-based APIs, preferably REST/JSON or XML/SOAP

- Knowledge of database administration and design, along with data modeling with star schemas, slowly changing dimensions, and/or data capture

- Demonstrated experience implementing big data use cases and an understanding of the standard design patterns commonly used in Hadoop-based deployments


- Experience with structured programming languages such as Java and Python

- Experience using streaming-centric solutions such as Kafka or Flink

- Hands-on experience with Apache NiFi or Cloudera CFM

- Experience with automation technologies such as Ansible

- Requires a Top Secret or higher clearance and a full-scope polygraph