Encryption is a technique for converting plaintext into ciphertext, which is unreadable without the appropriate decryption key. Hadoop is a platform for reliably storing and processing huge amounts of data, and it provides flexible big-data services through HDFS storage. Unfortunately, Hadoop's lack of built-in security measures makes the data it processes or stores vulnerable to hostile attacks, so securing data in Hadoop is a major challenge. The prevalence of attacks on private data underscores the critical need for robust encryption techniques. In this article, we propose Attribute-Based Honey Encryption (ABHE), a novel technique that builds on ciphertext-policy attribute-based encryption (CP-ABE) and adds a further layer of security to data stored in HDFS. We focus in particular on encrypting and decrypting files of varying sizes inside Hadoop, and we compare ABHE against the CP-ABE and KP-ABE algorithms. ABHE performs well even on large files and outperforms both baselines, making Hadoop safer for storing and processing data.
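The defining property of honey encryption, on which ABHE builds, is that decrypting a ciphertext with a *wrong* key still yields a plausible-looking plaintext rather than obvious garbage, which frustrates brute-force key guessing. The following is a minimal Python sketch of that idea over a toy message space; the message list, the trivial distribution-transforming encoder (DTE), and the modular cipher are all illustrative assumptions, not the scheme from this paper.

```python
import secrets

# Hypothetical toy message space standing in for plausible file names;
# a real honey-encryption DTE would model the full plaintext distribution.
MESSAGES = ["report_q1.csv", "report_q2.csv", "report_q3.csv", "report_q4.csv"]
N = len(MESSAGES)

def encode(msg):
    # Distribution-transforming encoder: message -> seed in [0, N).
    return MESSAGES.index(msg)

def decode(seed):
    # Inverse DTE: *every* seed decodes to some plausible message.
    return MESSAGES[seed % N]

def encrypt(msg, key):
    # Toy cipher: shift the seed by the key (illustrative only).
    return (encode(msg) + key) % N

def decrypt(ct, key):
    return decode((ct - key) % N)

key = secrets.randbelow(N)
ct = encrypt("report_q2.csv", key)
assert decrypt(ct, key) == "report_q2.csv"

# Decrypting with a wrong key still returns a valid-looking filename,
# so an attacker cannot tell a correct guess from an incorrect one.
wrong_key = (key + 1) % N
print(decrypt(ct, wrong_key))
```

The key design point is the DTE: because every seed decodes to a message drawn from the plausible-plaintext space, wrong keys produce decoys instead of detectable noise.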