
You want to understand more about how users browse your public website. For example, you want to know which pages they visit prior to placing an order. You have a server farm of 100 web servers hosting your website.

Which is the most efficient way to gather the logs from these web servers into the Hadoop ecosystem?
A . Just copy them into HDFS using curl
B . Ingest the server web logs into HDFS using Apache Flume
C . Channel these clickstreams into Hadoop using Hadoop Streaming
D . Import all user clicks from your OLTP databases into Hadoop using Sqoop

Answer: B
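
Apache Flume is built for exactly this scenario: continuously collecting log data from many servers and delivering it into HDFS. Below is a minimal sketch of a Flume agent configuration that would run on each web server; the log path, NameNode host, and HDFS directory are assumptions and would need to match the actual environment.

# Hypothetical Flume 1.x agent: tail the web access log and write it to HDFS.
agent.sources = weblog
agent.channels = mem
agent.sinks = hdfsSink

# Source: tail the local access log (path is an assumption)
agent.sources.weblog.type = exec
agent.sources.weblog.command = tail -F /var/log/httpd/access_log
agent.sources.weblog.channels = mem

# Channel: buffer events in memory between source and sink
agent.channels.mem.type = memory
agent.channels.mem.capacity = 10000

# Sink: write events into HDFS, bucketed by date
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.channel = mem
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/weblogs/%Y-%m-%d
agent.sinks.hdfsSink.hdfs.fileType = DataStream
agent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true

Each agent could then be started with something like: flume-ng agent --name agent --conf-file weblog.conf. The other options do not fit: curl does not scale to 100 servers, Hadoop Streaming is for writing MapReduce jobs rather than ingesting data, and Sqoop imports from relational databases, not web server log files.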
