Remote Data Mining And Management Job In Data Science And Analytics

I need help with a penetration testing report.

Find more Data Mining And Management remote jobs posted recently Worldwide

Word Count: 1500

Research and create a narrow, deep methodology/testing-procedure design based specifically on one or two selected services running on the Metasploitable target host (target 1 from the LinuxZoo labs). Perform and document an experiment to penetration test these services, using the results of the techniques and tools from the testing procedure you created. Do not simply replicate testing activities from the labs; instead, introduce new techniques and tools informed by your own research. Comparing some tools and techniques could be a good approach. Produce a reflective discussion around your testing and findings, including a brief evaluation of your method, reflection on the possible impact and risk to the host owner, and some possible countermeasures. Finally, create an associated 'executive summary' presenting your important findings at a management level.
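As an illustration of the kind of service-focused probing the brief asks for (this sketch is not part of the original posting), a minimal TCP banner grab in Python can feed the service-version research step. The host and port values are placeholders; the real Metasploitable target address comes from the LinuxZoo lab environment:

```python
import socket


def grab_banner(host: str, port: int, timeout: float = 3.0) -> str:
    """Connect to a TCP service and return any banner it sends on connect.

    Several Metasploitable services (e.g. FTP on 21, SSH on 22) announce
    a version string immediately, which can then be researched for
    known vulnerabilities.
    """
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.settimeout(timeout)
        try:
            return s.recv(1024).decode(errors="replace").strip()
        except socket.timeout:
            return ""


def classify_banner(banner: str) -> str:
    """Rough triage of a banner into a protocol family (illustrative only)."""
    if banner.startswith("SSH-"):
        return "ssh"
    if banner.startswith("220"):
        return "ftp-or-smtp"  # both protocols greet with a 220 line
    return "unknown"
```

A usage sketch against a lab target might be `classify_banner(grab_banner("TARGET_IP", 21))`; only run this against hosts you are authorised to test.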
About the recruiter
Member since Mar 14, 2020
Prince Kunal
from Attiki, Greece

Skills & Expertise Required

Data Science & Analytics, Data Mining & Management

Open for hiring. Apply before: Oct 20, 2024

Work from Anywhere

40 hrs / week

Fixed Type

Remote Job

Cost: $38.08

Similar Projects

Expert Excel user needed!

Looking for an expert Excel formula builder to assist with creating our customer profit-and-loss Excel sheet.

We are looking to create a Google Sheets doc where we can track equipment expenses, man hours, and additional expenses on a per jo...read more

Data Engineering Databricks

Need some help writing a PySpark query. Email me for more info. Should have very good knowledge about achieving performance, as the query will be applied to huge data, more than 500M rows. Should have knowledge of Parquet and Delta Lake.

Crawler/bot building: extract website data directly, with permission, into our own environment

We are looking for somebody who can think outside the box. We need a crawler built for a couple of particular websites, and this will be ongoing work for us.