


Job Opening: Systems Engineer

22 March, 2021

Title: Systems Engineer

Qualification: Master's/Bachelor's degree in CS/IT/Engg/Buss./Math/Sci./Management or equivalent.

Minimum experience required: Master's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 1 year of experience, or Bachelor's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 5 years of experience.

Date: 03/22/2021

Location: Hoffman Estates IL
Pay rate: $89000.00/year
Length: Long term/permanent

Job Duties: As a Systems Engineer, the employee will be responsible for the following job duties:

• Install, configure, maintain, automate, and secure Linux/Unix systems: RHEL 5.x, 6.x, 7.x, 8.x and AIX 5, 6, 7.
• Work on the Red Hat Linux kernel and memory upgrades, and perform Red Hat Linux Kickstart installations.
• Work on HPC cluster and security administration, backup and recovery, and kernel tuning.
• Implement, configure, and deploy new patches, upgrades, and bug fixes on both physical and virtual Red Hat Linux servers using Red Hat Satellite Server and VMware.
• Configure and work on servers using Nagios and Jenkins.
• Work on SAN and NAS using IBM AIX, Red Hat Linux, and Veritas Clustering.
• Automate configurations and tasks using Python, Bash, and PowerShell scripting.
• Work with various security components such as Secure ID management, LDAP, SSL certificates, NIS / Identity Management, and other third-party security products.
• Configure and administer NFS, DNS, LDAP, TFTP, DHCP, and mail server clients.
• Work with TCP/IP protocols for DNS and hostname/IP resolution.
• Manage WebLogic 8.x, 9.0, 10; WebSphere 5.x, 6.x; Apache Tomcat; IDS with Snort; and Splunk.
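The scripting and hostname/IP-resolution duties above can be sketched in a few lines of Python. This is an illustrative example only, not part of the posting; the hostnames are placeholders:

```python
# Illustrative sketch only: a minimal automated check for hostname / IP
# resolution of the kind described in the duties above.
import socket
from typing import Optional

def resolve(hostname: str) -> Optional[str]:
    """Return the IPv4 address for hostname, or None if resolution fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# Placeholder hostnames; a real check would read these from inventory.
for host in ("localhost", "no-such-host.invalid"):
    print(host, "->", resolve(host))
```

A script like this could be scheduled (e.g., via cron) to flag DNS problems before users notice them.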

Skills/Technology:

RHEL 5.x, 6.x, 7.x, 8.x; AIX 5, 6, 7; kernel tuning; HPC clusters; Red Hat Satellite Server; VMware; Nagios; Jenkins. Configure and maintain SAN, NAS, and Veritas Clustering. Python and Bash scripting; SSL; NIS / Identity Management; LDAP; TFTP; NFS; DNS; DHCP; TCP/IP services. WebLogic 8.x, 9.0, 10; WebSphere 5.x, 6.x; Apache Tomcat; IDS with Snort; Splunk.

The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no


Job Opening: Network Engineer

11 March, 2021

Only IT Consulting, LLC seeks a Network Engineer to design, configure, and secure networks using switches (STP, RSTP, PVST, MSTP, VTP, vPC, HSRP, VRRP, GLBP) and routing protocols (OSPF, BGP, EIGRP), and to configure DMVPN and IPsec tunnels. Equipment includes firewalls (Cisco ASA 5500, 5500-X series), switches (Cisco Catalyst 9500, 4500, 6500, 3700, 3600, 3500, 2960 series; Cisco Nexus 2K, 5K, 7K, 9K series), routers (Cisco ISR 1000, 4000 series; ASR 1000, 9000 series), F5 BIG-IP Load Balancer LTM and GTM, wireless access points (3800, 1800, 2700 series), controllers (3500, 5500), and Cisco ISE. Requirements: Master's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 1 year of experience, or Bachelor's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 5 years of experience. Mail resume: 2200 W Higgins RD STE, Hoffman Estates IL 60169.
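The subnet and addressing reasoning behind the routing and switching work above can be illustrated with Python's standard `ipaddress` module. This is an illustrative sketch only; all addresses and prefixes are made-up examples:

```python
# Illustrative sketch only: subnet membership and sizing checks of the kind
# that underpin routing and VLAN design. Addresses are placeholder examples.
import ipaddress

def in_subnet(ip: str, cidr: str) -> bool:
    """True if the host address falls inside the given CIDR block."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network(cidr, strict=False)

def usable_hosts(cidr: str) -> int:
    """Number of assignable host addresses in an IPv4 subnet
    (total addresses minus network and broadcast)."""
    net = ipaddress.ip_network(cidr, strict=False)
    return max(net.num_addresses - 2, 0)

print(in_subnet("10.1.2.3", "10.1.0.0/16"))  # True
print(usable_hosts("192.168.1.0/24"))        # 254
```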


Job Opening: Big Data Engineer

30 September, 2020

Title: Big Data Engineer

Qualification: Master's/Bachelor's degree in CS/IT/Engg/Buss./Math/Sci./Management or equivalent.

Minimum experience required: Master's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 3 years of experience, or Bachelor's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 5 years of experience.

Skills: KNIME, Python with Keras, Spark MLlib, TensorFlow, Weka, NLTK, scikit-learn, PyTorch. Implementation of solution architecture in the Hadoop ecosystem (Hadoop, Hortonworks, CDH, GCP, AWS): MapReduce, Pig, Hive, Tez, Spark, Phoenix, Presto, HBase, Accumulo, Storm, Kafka, Flume, Falcon, Atlas, Oozie, Ambari, Hue. Security: Kerberos, Ranger, Knox, AD/LDAP.

Date: 09/30/2020

Location: Hoffman Estates IL

Pay rate: $115,378.00/year
Length: Long term/permanent

Job Duties: As Big Data Engineer, the employee will be responsible for the following job duties:

• Work with various tools and programming languages such as Hadoop, Python, R, Nginx, and Gunicorn.
• Work on Hadoop ecosystem tools such as HDFS, YARN, Hive, Spark MLlib, and HBase.
• Work on building analytical models on a Big data platform
• Work closely with other data and analytics team members to optimize the company’s data systems and pipeline architecture
• Design and build the infrastructure for data extraction, preparation, and loading of data from a variety of sources using technology such as SQL and AWS
• Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer activity
• Write complex scripts in languages such as Python (with Keras), Java, or Scala.
• Install and configure Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, HBase, ZooKeeper, and Oozie.
• Work on TensorFlow, Weka, NLTK, Scikit-learn and PyTorch
• Work on testing environments, setting up HDFS, Hive, Pig, and MapReduce access for new users.
• Work on architecture for solutions implementation on big data.
• Work on Amazon Web Services, AWS command line interface, and AWS data pipeline.
• Work with popular Hadoop distributions and platforms such as Hortonworks, CDH, and GCP.
• Work on data science development using KNIME and Python with Keras.
• Work independently with Hortonworks support on any issues/concerns with the Hadoop cluster.
• Implement Hadoop Security using Kerberos, Ranger, Knox, AD/LDAP on Hortonworks Cluster.
• Implement Hadoop ecosystem components: MapReduce, HDFS, HBase, ZooKeeper, Pig, Hue, Hive, Kafka, Flume, Falcon, Atlas, Oozie, and Ambari.
• Work on Phoenix, Presto, Tez, Accumulo, and Storm.
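The MapReduce pattern named in the duties above can be modeled in plain Python. This is an illustrative sketch of the map/shuffle/reduce idea only, not an actual Hadoop job; the input lines are made-up examples:

```python
# Illustrative sketch only: the map -> shuffle -> reduce pattern behind a
# Hadoop word-count job, modeled with plain Python data structures.
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Group pairs by key and sum counts, as a Hadoop reducer would."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

data = ["big data big pipelines", "data engineering"]  # placeholder input
print(reduce_phase(map_phase(data)))
```

In a real cluster the shuffle step distributes keys across reducer nodes; here the grouping happens in one dictionary.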

The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no


Job Opening: Data Warehouse Analyst

09 June, 2020

Title: Data Warehouse Analyst

Qualification: Master's/Bachelor's degree in CS/IT/Engg/Buss./Math/Sci./Management or equivalent.

Minimum experience required: Master's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 1 year of experience, or Bachelor's in CS/IT/Engg/Buss./Math/Sci./Management or equivalent plus 5 years of experience.

Skills: Analyze, design, implement, and process data using data warehouse tools such as IBM Netezza, Oracle, MSSQL, Squirrel, Teradata, and DB2. ETL mappings with Informatica PowerCenter (IPC). Create data sets and dashboards using SAS and RStudio. Work on IBM Watson Analytics Advantage Suite Report Writer: crosstab, record listing, and distribution list reports. Create interactive dashboards in Tableau, Gephi, IBM Watson Analytics Health Insights Explorer, and SAP BusinessObjects.

Date: 06/09/2020

Location: Hoffman Estates IL

Pay rate: $96,325.00/year
Length: Long term/permanent

Job Duties: As Data Warehouse Analyst, the employee will be responsible for the following job duties:

• Analyze, design, implement, and process data using data warehousing and data integration solutions built on Informatica PowerCenter and IBM Netezza.
• Develop jobs/scripts for extracting, transforming, and loading data into Teradata data warehouse tables from source systems such as Oracle and MSSQL database tables, sequential files, and delimited files, using Informatica PowerCenter and Teradata utilities.
• Analyze, design, develop, and implement ETL processes using various Teradata utilities such as TPT, BTEQ, FLOAD, MLOAD, TPUMP, and FEXPORT.
• Responsible for error handling and debugging; implement performance-tuning techniques on sources, targets, mappings, and workflows in Informatica ETL mappings.
• Work with ETL mappings in Informatica PowerCenter.
• Perform performance tuning on ETL and database environments.
• Work on Oracle, MSSQL, Squirrel scripts to configure, perform and validate database operations using ETL tools.
• Work on Data sets, Dashboards using SAS and RStudio.
• Develop, maintain, and manage advanced reporting, analytics, dashboards, and other BI solutions.
• Perform and document data analysis, data validation, and data mapping/design.
• Plan and implement automated bursting of reports via publication services and scheduling.
• Visualize and analyze large networks graphs by using Gephi.
• Create SAP BusinessObjects infrastructures, install upgrades, support authentication requirements, and monitor users and reports.
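The extract-transform-load flow these duties describe can be sketched in miniature with Python's built-in sqlite3 module. This is an illustrative example only; sqlite3 stands in for the Teradata/Informatica stack, and the source rows, field names, and table name are made-up:

```python
# Illustrative sketch only: a minimal extract -> transform -> load pass.
# sqlite3 stands in for a real warehouse target; data is placeholder.
import sqlite3

# Extract: rows as they arrive from a hypothetical source system.
source_rows = [("  Alice ", "2020-06-09"), ("BOB", "2020-06-10")]

def transform(row):
    """Normalize the name field before loading (trim and title-case)."""
    name, load_date = row
    return name.strip().title(), load_date

# Load: write the cleaned rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (name TEXT, load_date TEXT)")
conn.executemany("INSERT INTO warehouse VALUES (?, ?)",
                 (transform(r) for r in source_rows))
print(conn.execute("SELECT name FROM warehouse ORDER BY name").fetchall())
```

A production pipeline would add the error handling and performance tuning named above; the shape of the work is the same.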


The contact details are:

Santosh Srivastava
Only IT Consulting, LLC
2200 W Higgins RD STE 315,
Hoffman Estates IL 60169

Travel required: 0 to 25%
Telecommute: no

