I need someone who can:
1 - Configure a Kafka Cluster (2 brokers) on CentOS 8
2 - Install kafka-manager (CMAK) and Elastic for management / monitoring
3 - ETL in Java (JSON-formatted log files -> Kafka -> ElasticSearch and Redis)
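For reference on step 1: a 2-broker cluster mostly comes down to two server.properties files that differ in broker.id and listeners. A minimal sketch of broker 1 (hostnames, paths, and the ZooKeeper address are placeholder assumptions, not details from this posting):

```properties
# Hypothetical /opt/kafka/config/server.properties for broker 1 of 2
broker.id=1
listeners=PLAINTEXT://kafka1.example.com:9092
log.dirs=/var/lib/kafka/data
zookeeper.connect=zk1.example.com:2181
# With only 2 brokers, replicated topics can have at most 2 replicas
default.replication.factor=2
offsets.topic.replication.factor=2
min.insync.replicas=1
```

Broker 2 would use broker.id=2 and its own listener address; note that a 2-broker cluster cannot tolerate a broker failure without losing in-sync replicas when min.insync.replicas=2 is required.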
I will use Kafka to build a "real-time" ETL pipeline that takes log files as input and writes the output to Redis and ElasticSearch
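The transform step of such a pipeline can be sketched in plain Java. This is only an illustration under stated assumptions: the naive parser handles flat, unescaped JSON and stands in for a real library such as Jackson; the `date` and `host` fields, the daily index naming, and the Redis key scheme are hypothetical, not from the posting. The real pipeline would wrap this between a Kafka consumer and the Elasticsearch/Redis clients.

```java
import java.util.HashMap;
import java.util.Map;

public class LogEtlSketch {

    // Naive flat-JSON parser for illustration only (no nesting, no escapes)
    static Map<String, String> parseFlatJson(String line) {
        Map<String, String> fields = new HashMap<>();
        String body = line.trim();
        body = body.substring(1, body.length() - 1); // strip outer braces
        for (String pair : body.split(",")) {
            String[] kv = pair.split(":", 2);
            fields.put(unquote(kv[0]), unquote(kv[1]));
        }
        return fields;
    }

    static String unquote(String s) {
        s = s.trim();
        if (s.startsWith("\"") && s.endsWith("\"")) {
            return s.substring(1, s.length() - 1);
        }
        return s;
    }

    // Route to a daily Elasticsearch index, e.g. logs-2024.01.31
    // (assumes each log line carries a "date" field)
    static String esIndex(Map<String, String> fields) {
        return "logs-" + fields.get("date").replace("-", ".");
    }

    // One Redis key per host (assumes a "host" field in every log line)
    static String redisKey(Map<String, String> fields) {
        return "lastlog:" + fields.get("host");
    }

    public static void main(String[] args) {
        Map<String, String> f = parseFlatJson(
            "{\"date\":\"2024-01-31\",\"host\":\"web1\",\"msg\":\"ok\"}");
        System.out.println(esIndex(f));  // logs-2024.01.31
        System.out.println(redisKey(f)); // lastlog:web1
    }
}
```

The same routing logic runs unchanged whether the records arrive from Filebeat via Kafka or from any other source, which keeps the consumer loop thin.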
I have installed Filebeat on my servers to send the log files (JSON) to Kafka, but I would also like to hear about other options for working with journal logs
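For context, the Filebeat side might look like the fragment below; the paths, topic name, and broker addresses are placeholders. For journal logs specifically, newer Filebeat versions ship a journald input type, and piping `journalctl -o json` into a small producer is another commonly used option.

```yaml
# Hypothetical filebeat.yml fragment: ship JSON log lines straight to Kafka
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.json      # placeholder path
    json.keys_under_root: true   # decode each line as a JSON object
    json.add_error_key: true     # flag lines that fail to decode

output.kafka:
  hosts: ["kafka1.example.com:9092", "kafka2.example.com:9092"]
  topic: "app-logs"              # placeholder topic
  required_acks: 1
```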
I need someone experienced and organized who always follows best practices
15 freelancers are bidding on average $634 for this job
Hi, I can help you configure the Kafka cluster and the related items mentioned in the project description. I am a Linux administrator with 9 years of experience. Thanks, Ashish A.
Hi, I have 6+ years of experience with Hadoop technologies such as HDFS, MapReduce, Spark, Hive, and Kafka, as well as data mining techniques. I can complete your project. Please contact me...
Hello, I am an expert in Elasticsearch; in my regular job I have already built what you want: a distributed syslog server with Elastic as the database and Kafka as the broker.