Cannot access HDFS even though Hadoop is (apparently) installed on a CentOS server

Asked on Server Fault by con on August 23, 2020

I’m attempting to install SparkBeagle on a CentOS server, following the instructions on

I’ve opened an issue on their GitHub page, but I’m not getting a response. I’m just trying to follow the instructions on their webpage. The installation proceeded without errors, which I assume means SparkBeagle was installed correctly, so why can’t I access HDFS?

I’ve extracted the hadoop-3.3.0 tarball into a directory and made the changes to my .bashrc file, but I still can’t run the hdfs command.
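For context, the .bashrc additions for a tarball install usually follow the pattern sketched below. The paths here are illustrative assumptions (the extraction directory and JDK location will differ per machine) and must match where the tarball was actually unpacked:

```shell
# Illustrative paths -- adjust to the actual extraction directory and JDK.
export HADOOP_HOME=/opt/hadoop-3.3.0
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
# The hdfs launcher script also needs JAVA_HOME to be set.
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
```

After sourcing .bashrc (or opening a new shell), `which hdfs` should resolve to `$HADOOP_HOME/bin/hdfs`; if it doesn’t, the PATH entry doesn’t point at the real install location.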

Why isn’t this package working? How can I access the hdfs command?
