Sr. Security Consultant

Company: Gallega Software Solutions Inc

Location: Hartford, CT 06106

Description:

Job Description

Job Details below:

2) Environment Review

a) Verify OS and file system configuration

b) Review planned HDP environment and architecture

c) Discuss technology choices best suited to implement high-priority use-cases

3) Install Production Cluster

a) Install Ambari and HDP on up to 28 nodes

b) Configure cluster parameters based on hardware specification

c) Enable high availability mode for HDP components

(1) HDFS NameNode

(2) YARN ResourceManager

(3) Hive metastore

(4) Falcon

(5) Oozie

(6) Ambari

4) Install non-Prod Cluster

a) Mentor the customer in installing a non-production HDP cluster on up to 28 nodes

5) Performance tuning

a) Run benchmarking tools to establish an initial baseline

b) Use the available scripts to execute Hive queries specific to the customer's environment

c) Iteratively apply configuration changes to maximize system performance

d) Automate platform benchmarks to execute daily (a sketch follows this section)

e) Develop dashboards to visualize benchmark results over time
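
The daily benchmark automation in 5d can stay quite small. Below is a minimal sketch, assuming the benchmark queries live in an HQL file and are submitted through the beeline CLI; the JDBC URL, file paths, and output location are placeholders rather than values from this engagement. Scheduled through cron or an Oozie coordinator, the appended CSV rows become the data source for the dashboards in 5e.

#!/usr/bin/env python3
"""Daily Hive benchmark runner (illustrative sketch only).

Assumes a HiveServer2 JDBC URL and a benchmark query file exist;
all three constants below are placeholders, not values from the SOW.
"""
import csv
import subprocess
import time
from datetime import datetime

JDBC_URL = "jdbc:hive2://hiveserver2.example.com:10000/default"  # placeholder
QUERY_FILE = "/opt/benchmarks/benchmark.hql"                     # placeholder
RESULTS_CSV = "/opt/benchmarks/results.csv"                      # placeholder

def run_benchmark():
    """Execute the benchmark queries through beeline and return wall-clock seconds."""
    start = time.monotonic()
    subprocess.run(["beeline", "-u", JDBC_URL, "-f", QUERY_FILE], check=True)
    return time.monotonic() - start

def record(elapsed):
    """Append one timestamped result row so a dashboard can plot the trend over time."""
    with open(RESULTS_CSV, "a", newline="") as fh:
        csv.writer(fh).writerow([datetime.now().isoformat(), round(elapsed, 2)])

if __name__ == "__main__":
    record(run_benchmark())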

6) Additional cluster hardening

a) Simulate HDFS/Hive/YARN failure scenarios and document corrective actions

b) Modify log4j properties for each component to delete old log files

(1) HDFS

(2) Ranger

(3) YARN

(4) Oozie

(5) Zookeeper

(6) Ambari

(7) Solr

(8) Hive

c) Automate cleanup of the HDFS landing zone and tmp directories

d) Check HDP health on a daily basis:

(1) Run a service check on every HDP component before the start of day through the Ambari REST API (a sketch follows this section)

(2) Store results in a metrics database

(3) Visualize results over time

e) Automate backups of:

(1) HDFS snapshots

(2) Configuration files

(3) Ambari DB

(4) Hive metastore DB

(5) Oozie DB
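
For the daily health check in 6d, one hedged approach is to poll the Ambari REST API for each service's state and append the results to a simple metrics store; launching full per-service check commands uses a POST against the same API and is omitted here for brevity. The Ambari host, cluster name, credentials, and output path below are placeholders.

#!/usr/bin/env python3
"""Daily HDP service-state check via the Ambari REST API (illustrative sketch only)."""
import csv
from datetime import datetime

import requests

AMBARI = "http://ambari.example.com:8080"        # placeholder Ambari server
CLUSTER = "PRODUCTION"                           # placeholder cluster name
AUTH = ("health_check_user", "password")         # placeholder; use a vaulted, least-privilege account
METRICS_CSV = "/var/log/hdp_health/history.csv"  # placeholder metrics store

def service_states():
    """Return {service_name: state} for every service registered in the cluster."""
    url = f"{AMBARI}/api/v1/clusters/{CLUSTER}/services?fields=ServiceInfo/state"
    resp = requests.get(url, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return {
        item["ServiceInfo"]["service_name"]: item["ServiceInfo"]["state"]
        for item in resp.json()["items"]
    }

if __name__ == "__main__":
    now = datetime.now().isoformat()
    with open(METRICS_CSV, "a", newline="") as fh:
        writer = csv.writer(fh)
        for service, state in service_states().items():
            writer.writerow([now, service, state])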

7) Security Workshop

a) Conduct interviews regarding the customer's desired level of security and control choices

b) Document current and future security state

c) Work with customer to determine users and groups who require access

d) Review currently enabled security features

e) Determine security needs for production and non-production environments

f) Document current state and high-level steps needed to attain future state

8) Connect HDP to centralized authentication service

a) Install and configure Apache Ranger

b) Enable Ranger to run in highly available mode

c) Enable HDP connectivity to LDAP provider such as Active Directory, FreeIPA, OpenLDAP or another provider

d) Create Kerberos principals and keytabs to authenticate HDP services (a sketch follows this section)

e) Create Kerberos principals for the users of HDP by working with a member of the Customer's security team
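
Item 8d is commonly scripted around MIT Kerberos's kadmin. The sketch below only illustrates the underlying addprinc/ktadd calls; the realm, host, service list, and keytab directory are placeholders to be agreed with the customer's security team, and in practice the Ambari Kerberos wizard can generate most service principals automatically.

#!/usr/bin/env python3
"""Create service principals and keytabs with MIT Kerberos (illustrative sketch only).

Run on the KDC host as a user permitted to execute kadmin.local. The realm,
host name, and keytab directory are placeholders, not values from the SOW.
"""
import subprocess

REALM = "EXAMPLE.COM"                 # placeholder realm
HOST = "worker01.example.com"         # placeholder cluster host
KEYTAB_DIR = "/etc/security/keytabs"  # common HDP location; confirm per site

SERVICES = ["nn", "hive", "yarn", "oozie"]  # placeholder list of service principals

def kadmin(query):
    """Run a single kadmin.local query and fail loudly on error."""
    subprocess.run(["kadmin.local", "-q", query], check=True)

if __name__ == "__main__":
    for svc in SERVICES:
        principal = f"{svc}/{HOST}@{REALM}"
        kadmin(f"addprinc -randkey {principal}")
        kadmin(f"ktadd -k {KEYTAB_DIR}/{svc}.service.keytab {principal}")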

9) Create Authorization policies

a) Implement permissions to control read/write authority to directories in HDFS (a Ranger policy sketch follows this list)

b) Restrict read and write access to tables stored in Hive

c) Isolate workloads in YARN by mapping users and groups to differing queues
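
Items 9a and 9b are normally expressed as Ranger policies rather than raw permissions. The sketch below creates one HDFS path policy through Ranger's public REST API; the admin URL, credentials, Ranger service name, group, and exact payload field names are assumptions that can vary by Ranger version and should be checked against the deployed instance.

#!/usr/bin/env python3
"""Create a Ranger HDFS path policy via the public REST API (illustrative sketch only)."""
import requests

RANGER = "http://ranger.example.com:6080"   # placeholder Ranger Admin URL
AUTH = ("admin", "password")                # placeholder credentials
HDFS_SERVICE = "cluster_hadoop"             # placeholder Ranger service (repository) name

policy = {
    "service": HDFS_SERVICE,
    "name": "landing-zone-read-write",
    "resources": {"path": {"values": ["/data/landing"], "isRecursive": True}},
    "policyItems": [
        {
            "groups": ["etl_developers"],   # placeholder group requiring access
            "accesses": [
                {"type": "read", "isAllowed": True},
                {"type": "write", "isAllowed": True},
            ],
        }
    ],
}

resp = requests.post(
    f"{RANGER}/service/public/v2/api/policy",
    json=policy,
    auth=AUTH,
    headers={"Content-Type": "application/json"},
    timeout=30,
)
resp.raise_for_status()
print("Created policy id:", resp.json().get("id"))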

10) Enable centralized audit collection

a) Install Solr for audit log indexing and search capability

b) Enable centralized audit log collection for applicable HDP components

c) Develop a dashboard showing the number of granted and denied access attempts over time

d) Visualize the number of denied attempts by HDP component (see the sketch below)
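
Ranger typically indexes its audit events in a Solr collection (often named ranger_audits), where a denied attempt is recorded with result:0 and the originating component in the repo field. The query behind the visualization in 10d might look like the sketch below; the Solr URL, collection, and field names are assumptions to verify against the deployed audit schema.

#!/usr/bin/env python3
"""Count denied access attempts per component from Ranger's Solr audit collection
(illustrative sketch; URL, collection, and field names should be verified on site)."""
import requests

SOLR = "http://solr.example.com:8886/solr/ranger_audits"  # placeholder

params = {
    "q": "result:0",        # 0 = access denied in Ranger audit records (assumption)
    "rows": 0,              # only the facet counts are needed, not the documents
    "facet": "true",
    "facet.field": "repo",  # Ranger service/repository name (assumption)
    "wt": "json",
}

resp = requests.get(f"{SOLR}/select", params=params, timeout=30)
resp.raise_for_status()
counts = resp.json()["facet_counts"]["facet_fields"]["repo"]

# Solr returns facets as a flat [value, count, value, count, ...] list.
for repo, denied in zip(counts[::2], counts[1::2]):
    print(f"{repo}: {denied} denied attempts")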

11) Enable data-at-rest encryption

a) Install and configure Ranger Key Management Service (KMS)

b) Enable encryption for up to five (5) individual directories in HDFS as identified by the customer (a sketch follows)
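
Item 11b combines a Ranger KMS key with an HDFS encryption zone over each directory. A minimal sketch of the underlying hadoop key and hdfs crypto commands, wrapped for repeatability, is shown below; the directory and key names are placeholders for the customer-identified paths, and the commands must run as a user with HDFS superuser and KMS key-admin rights.

#!/usr/bin/env python3
"""Create a KMS encryption key and an HDFS encryption zone over a directory
(illustrative sketch only; key and path names are placeholders)."""
import subprocess

ZONES = {
    "/data/secure/finance": "finance_key",  # placeholder directory -> key name
}

def sh(cmd):
    """Run a command and fail loudly if it returns non-zero."""
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for path, key in ZONES.items():
        sh(["hadoop", "key", "create", key])                           # key is stored in Ranger KMS
        sh(["hdfs", "dfs", "-mkdir", "-p", path])                      # zone directory must exist and be empty
        sh(["hdfs", "crypto", "-createZone", "-keyName", key, "-path", path])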

12) Additional security work

a) Assist in enabling masking of data as it is accessed through Hive

b) Develop sample scripts showing how to connect to a Hive table with Kerberos authentication (see the sketch below)
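
One possible shape for the sample script in 12b, assuming the PyHive client library (with its Hive/SASL extras) and a valid Kerberos ticket obtained with kinit on the client machine; the HiveServer2 host, database, and table name are placeholders.

#!/usr/bin/env python3
"""Connect to a Kerberized HiveServer2 and read a table (illustrative sketch only).

Requires PyHive installed with its Hive extras and a ticket from `kinit` before running.
Host, database, and table names are placeholders.
"""
from pyhive import hive

conn = hive.Connection(
    host="hiveserver2.example.com",  # placeholder HiveServer2 host
    port=10000,
    database="default",
    auth="KERBEROS",
    kerberos_service_name="hive",    # must match the HiveServer2 service principal
)

cursor = conn.cursor()
cursor.execute("SELECT * FROM sample_table LIMIT 10")  # placeholder table
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()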

Additional Information

All your information will be kept confidential according to EEO guidelines.
