Job Opening: Big Data Engineer

30 September, 2020

Title: Big Data Engineer

Qualification: Master's or Bachelor's degree in CS/IT/Engineering/Business/Math/Science/Management, or equivalent.

Minimum experience required: Master's in CS/IT/Engineering/Business/Math/Science/Management or equivalent plus 3 years of experience, or Bachelor's in the same fields or equivalent plus 5 years of experience.

Skills: KNIME, Python, Keras, Spark MLlib, TensorFlow, Weka, NLTK, scikit-learn, PyTorch. Implementation of solution architecture in the Hadoop ecosystem (Hadoop, Hortonworks, CDH, GCP, AWS): MapReduce, Pig, Hive, Tez, Spark, Phoenix, Presto, HBase, Accumulo, Storm, Kafka, Flume, Falcon, Atlas, Oozie, Ambari, Hue; security via Kerberos, Ranger, Knox, AD/LDAP.

Date: 09/30/2020

Location: Hoffman Estates IL

Pay rate: $115,378.00/year
Length: Long term/permanent

Job Duties: As a Big Data Engineer, the employee will be responsible for the following job duties:

• Work with various tools and programming languages, including Hadoop development, Python, R, Nginx, and Gunicorn.
• Work on Hadoop ecosystem tools such as HDFS, YARN, Hive, Spark MLlib, and HBase.
• Build analytical models on a big data platform (a brief PySpark sketch follows this list).
• Work closely with other data and analytics team members to optimize the company’s data systems and pipeline architecture
• Design and build the infrastructure for extracting, preparing, and loading data from a variety of sources, using technologies such as SQL and AWS.
• Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer activity
• Write complex scripts in languages such as Python, Java, or Scala, and in frameworks such as Keras.
• Install and configure Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, HBase, ZooKeeper, and Oozie.
• Work on TensorFlow, Weka, NLTK, scikit-learn, and PyTorch.
• Set up test environments with HDFS, Hive, Pig, and MapReduce access for new users.
• Design solution architectures for big data implementations.
• Work on Amazon Web Services, the AWS command line interface, and AWS Data Pipeline.
• Work with popular Hadoop distributions such as Hortonworks and CDH, as well as GCP.
• Work on data science development using KNIME, Python, and Keras.
• Work independently with Hortonworks support on any issues or concerns with the Hadoop cluster.
• Implement Hadoop security using Kerberos, Ranger, Knox, and AD/LDAP on the Hortonworks cluster.
• Implement Hadoop ecosystem components: MapReduce, HDFS, HBase, ZooKeeper, Pig, Hue, Hive, Kafka, Flume, Falcon, Atlas, Oozie, and Ambari.
• Work on Phoenix, Presto, Tez, Accumulo, and Storm.
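
For illustration only, a minimal PySpark sketch of the kind of analytical-model work referenced above, using Spark MLlib's logistic regression. The input path, feature columns ("visits", "spend", "tenure_days"), and label column ("churned") are hypothetical placeholders, not details from this posting.

    # Hedged sketch (hypothetical data and columns): train a churn model
    # with Spark MLlib's logistic regression.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("analytical-model-sketch").getOrCreate()

    # Hypothetical curated table produced by an upstream Hive/Spark pipeline.
    df = spark.read.parquet("hdfs:///warehouse/customer_activity.parquet")

    # MLlib expects the features packed into a single vector column.
    assembler = VectorAssembler(
        inputCols=["visits", "spend", "tenure_days"],  # hypothetical features
        outputCol="features",
    )
    train = assembler.transform(df).select("features", "churned")

    # Fit the model and report the training AUC.
    model = LogisticRegression(labelCol="churned").fit(train)
    print("Training AUC:", model.summary.areaUnderROC)

    spark.stop()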

The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no


Job Opening: Data Warehouse Analyst

09 June, 2020

Title: Data Warehouse Analyst

Qualification: Master's or Bachelor's degree in CS/IT/Engineering/Business/Math/Science/Management, or equivalent.

Minimum experience required: Master's in CS/IT/Engineering/Business/Math/Science/Management or equivalent plus 1 year of experience, or Bachelor's in the same fields or equivalent plus 5 years of experience.

Skills: Analyze, design, implement, and process data using data warehouse tools such as IBM Netezza, Oracle, MSSQL, Squirrel, Teradata, and DB2. ETL mappings with Informatica PowerCenter (IPC). Create data sets and dashboards using SAS and RStudio; work on IBM Watson Analytics Advantage Suite Report Writer, including crosstab, record listing, and distribution list reports. Create interactive dashboards in Tableau, Gephi, IBM Watson Analytics Health Insights Explorer, and SAP BO.

Date: 06/09/2020

Location: Hoffman Estates IL

Pay rate: $96,325.00/year
Length: Long term/permanent

Job Duties: As a Data Warehouse Analyst, the employee will be responsible for the following job duties:

• Analyze, design, implement, and process data using data warehousing and data integration solutions built on Informatica PowerCenter and IBM Netezza.
• Develop jobs/scripts for extracting, transforming, and loading data into Teradata data warehouse tables from source systems such as Oracle and MSSQL tables, sequential files, and delimited files, using Informatica PowerCenter and Teradata utilities (a brief ETL sketch follows this list).
• Analyze, design, develop, and implement ETL processes using Teradata utilities such as TPT, BTEQ, FLOAD, MLOAD, TPUMP, and FEXPORT.
• Responsible for error handling and debugging; implement performance-tuning techniques on sources, targets, mappings, and workflows in Informatica ETL mappings.
• Work with ETL mappings in Informatica PowerCenter.
• Perform performance tuning on ETL and database environments.
• Work on Oracle, MSSQL, and Squirrel scripts to configure, perform, and validate database operations using ETL tools.
• Work on data sets and dashboards using SAS and RStudio.
• Develop, maintain, and manage advanced reporting, analytics, dashboards, and other BI solutions.
• Perform and document data analysis, data validation, and data mapping/design.
• Plan and implement automated bursting of reports via publication services and scheduling.
• Visualize and analyze large network graphs using Gephi.
• Create SAP BusinessObjects infrastructure, install upgrades, support authentication requirements, and monitor users and reports.
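
As a rough illustration of the extract-transform-load work described above, the following Python sketch pulls rows from a source database and appends them to a warehouse staging table with pandas and SQLAlchemy. The connection strings, table names, and columns are hypothetical placeholders, not details from this posting.

    # Hedged ETL sketch (hypothetical connections, tables, and columns):
    # extract from a source system, transform lightly, load to staging.
    import pandas as pd
    from sqlalchemy import create_engine

    source = create_engine("mssql+pyodbc://user:pw@source_dsn")   # hypothetical
    warehouse = create_engine("teradatasql://user:pw@td_host")    # hypothetical

    # Extract: pull the day's orders from the source system.
    orders = pd.read_sql("SELECT order_id, amount, order_ts FROM orders", source)

    # Transform: normalize timestamps and drop non-positive amounts.
    orders["order_ts"] = pd.to_datetime(orders["order_ts"])
    orders = orders[orders["amount"] > 0]

    # Load: append into the warehouse staging table.
    orders.to_sql("stg_orders", warehouse, if_exists="append", index=False)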

The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no


Job Opening: Systems Engineer (Backup & Recovery)

06 May, 2020

Title: Systems Engineer (Backup & Recovery)

Qualification: Master's or Bachelor's degree in CS/IT/Engineering/Business/Math/Science/Management, or equivalent.

Minimum experience required: Master's in CS/IT/Engineering/Business/Math/Science/Management or equivalent plus 1 year of experience, or Bachelor's in the same fields or equivalent plus 5 years of experience.

Skills: Back up, recover, install, configure, maintain, automate, and secure backup storage projects using Commvault, Veritas NetBackup, EMC, NetApp, SAN, NAS, Rubrik, VMware, AWS, NDMP, MySQL, Sun storage, Hadoop, Unix, Linux, Windows, LUNs, Oracle tape libraries (SL150, SL500, SL3000, SL8500), RAID, disaster recovery, ESX clusters, SCCM, disk pools, backup agents, recovery, and data center architecture.

Date: 05/06/2020

Location: Hoffman Estates IL

Pay rate: $87,901.00/year
Length: Long term/permanent

Job Duties: As a Systems Engineer (Backup & Recovery), the employee will be responsible for the following job duties:

• Work on various backup software technologies, including Veritas NetBackup, EMC Data Domain, NetApp, Rubrik, VMware, Sun StorageTek tape libraries, and AWS.
• Configure, manage, and troubleshoot application backups and restores.
• Perform refresh activities with the help of application teams using Windows, Unix, Oracle, MS-SQL, MySQL, MS Exchange, VMware, and NDMP (Network Data Management Protocol).
• Manage SAN and NAS environment storage: design, allocation, LUN masking, zoning, load balancing, failover configuration, replication management, performance monitoring, problem determination, and capacity planning.
• Install and configure Hadoop ecosystem components; handle Hadoop clusters and translate functional and technical requirements into detailed architecture and design.
• Install, configure, maintain, automate, and secure backup storage projects using Commvault.
• Automate and secure backup storage on Linux, Windows, and Unix operating systems (a brief AWS upload sketch follows this list).
• Configure Oracle tape libraries (SL150, SL500, SL3000, SL8500) and manage large data center operations.
• Upgrade backup appliance software, apply EEBs, and update RAID and BIOS firmware versions.
• Configure and set up disaster recovery scenarios and recover data in a different data center.
• Manage VMware vCenter, ESX clusters, Storage vMotion, and live migration.
• Develop processes and methodology for test, deployment, and maintenance to ensure scalability, consistency, and maintainability, decreasing delivery time and increasing availability.
• Work on technical issues such as robot down, disk pool full, SLP backlog, media server performance problems, replication jobs, slow backups, and restores.
• Develop backup client deployment SCCM packages and install backup agents on client servers using the SCCM template.
• Work on business continuity planning (BCP), disaster recovery (DR) planning, business impact analysis (BIA), performance analysis, and business trend analysis.
• Manage the delivery of corporate backup and archival services for all key systems, including planning, design, development, and management of appropriate processes, ensuring best practices.
• Provide regular and effective progress updates to, and work closely with, Backup & Recovery project managers to manage any delivery risks or issues.
• Manage the complete lifecycle of the data protection services infrastructure: backup and recovery, data center architecture, design solutions, implementations, and migrations.
• Analyze capacity data and develop capacity plans for enterprise-wide systems at the appropriate level.
• Develop project plans with clearly defined milestones and maintain project status.
• Coordinate hardware replacements with external vendors.
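
As an illustration of automating backup storage with AWS, the following Python sketch uses boto3 to upload a backup archive to S3 and verify that the object landed. The bucket name, object key, and file path are hypothetical placeholders, not details from this posting.

    # Hedged sketch (hypothetical bucket, key, and path): push a nightly
    # backup archive to S3 and verify the object exists.
    import boto3

    BUCKET = "corp-backup-archive"       # hypothetical bucket
    KEY = "nightly/db-backup.tar.gz"     # hypothetical object key

    s3 = boto3.client("s3")

    # upload_file handles multipart uploads for large archives automatically.
    s3.upload_file("/backups/db-backup.tar.gz", BUCKET, KEY)

    # head_object raises ClientError if the key is missing.
    meta = s3.head_object(Bucket=BUCKET, Key=KEY)
    print(f"Stored {meta['ContentLength']} bytes, ETag {meta['ETag']}")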

The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no


Job Opening: SAN Engineer

22 April, 2020

Title: SAN Engineer

Qualification: Master's or Bachelor's degree in CS/IT/Engineering/Business/Math/Science/Management, or equivalent.

Minimum experience required: Master's in CS/IT/Engineering/Business/Math/Science/Management or equivalent plus 3 years of experience, or Bachelor's in the same fields or equivalent plus 5 years of experience.

Skills: EMC Symmetrix (DMX, VMAX, VMAX3), VNX, Unity, Isilon, VPLEX, XtremIO, ScaleIO, VxRail, VxRack, IBM XIV, IBM Storwize V7000/V5000, HP 3PAR, NetApp 7-Mode and Cluster Mode, Pure, Data Domain. Data migration using DobiMiner, SRDF, RecoverPoint, Open Replicator, SAN Copy, TimeFinder, and PowerPath Migration Enabler involving Unix- and Windows-based hosts and clusters. Conversant with monitoring technologies such as ViPR, ViPR SRM, SVC, Grafana, and Splunk. Administration of cloud technologies (Azure, AWS, OpenStack) and DevOps technologies such as Chef, Jenkins, GlusterFS, Ceph, Actifio, Docker, and Kubernetes.

Date: 04/22/2020

Location: Hoffman Estates IL

Pay rate: $102,440.00/year
Length: Long term/permanent

Job Duties: As a SAN Engineer, the employee will be responsible for the following job duties:

• Responsible for maintaining and managing storage environments with IBM XIV and Pure Storage.
• Manage EMC VMAX3 eNAS, VMAX, VNX, VPLEX, XtremIO, Unity, and Isilon using EMC Unisphere, XIO, and SRM.
• Configure Isilon clusters, add them to Active Directory, and delegate object permissions to load-balance the Isilon clusters.
• Work on NetApp 7-Mode and Cluster Mode, NetApp AMS2500, and other Data Domain storage systems (DSS).
• Perform AIX and Linux administration, particularly on EMC Storage VMAX3, Isilon, VPLEX, EMC Symmetrix, VMAX, Unity, DMX 3/4, VNX, and SRM.
• Work on SAN Copy, Brocade switches, virtual SANs, and connections such as Fibre Channel over Ethernet (FCoE) and iSCSI.
• Work on PowerPath Migration Enabler involving Unix- and Windows-based hosts and clusters.
• Work on storage area networks using ScaleIO, VxRail, and VxRack.
• Responsible for all storage application firmware upgrades and installations.
• Install and implement new hardware and software in the environment, such as HCP with S10, VSP G800, and an HNAS 4080 six-node cluster upgrade.
• Manage HP 3PAR block storage arrays using the HP 3PAR StoreServ Management Console (SSMC).
• Migrate data from EMC VNX to HP 3PAR storage arrays using the Online Import tool.
• Work on HP 3PAR, USP, USP-V, CLARiiON, 7-Mode, and HiCommand 8.
• Perform administration of cloud environments using Azure, AWS, and OpenStack.
• Upgrade, install, and implement Hitachi Hi-Track storage monitoring, Storage Navigator, and Hitachi Tuning Manager for performance monitoring and reporting, along with Data Domain, Hitachi Command Suite, and Device Manager.
• Perform storage management using IBM Storwize V7000/V5000.
• Work on monitoring technologies such as ViPR, ViPR SRM, SVC, Grafana, and Splunk.
• Responsible for handling servers using TimeFinder.
• Manage the file systems, spans, EVS, and node configurations for the HNAS 4080.
• Create and expand HNAS file systems, storage pools, and dynamic pools on enterprise storage arrays.
• Automate cloud operations using DevOps tools and technologies such as Chef, Jenkins, GlusterFS, Ceph, Actifio, Docker, and Kubernetes.
• Handle storage operational tickets from other departments for all kinds of errors and issues on the storage.
• Work with Hitachi GSC/GCC for escalations on issues and resolutions.
• Provide data migration using DobiMiner, SRDF, RecoverPoint, and Open Replicator; plan and implement host-based migration using LVM on AIX servers.
• Create weekly Brocade SAN health check reports on switches in the environment (a brief report sketch follows this list).
• Provide weekly reports for senior management with Visio diagrams and graphs.
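
For illustration, a small Python sketch of the kind of weekly health-check summarization described above. It assumes a hypothetical CSV export of switch-port error counters (the file name, column layout, and threshold are placeholders, not details from this posting) and flags ports whose error counts exceed a cutoff.

    # Hedged sketch (hypothetical CSV export and columns): flag switch
    # ports whose CRC error counters exceed a weekly threshold.
    import csv

    ERROR_THRESHOLD = 50  # hypothetical cutoff for flagging a port

    flagged = []
    with open("brocade_port_counters.csv", newline="") as f:
        # Assumed columns: switch, port, crc_errors
        for row in csv.DictReader(f):
            if int(row["crc_errors"]) > ERROR_THRESHOLD:
                flagged.append((row["switch"], row["port"], row["crc_errors"]))

    print(f"{len(flagged)} ports exceed {ERROR_THRESHOLD} CRC errors this week:")
    for switch, port, errors in flagged:
        print(f"  {switch} port {port}: {errors} CRC errors")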


The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no

