This article presents a short-text analytics method that combines BERT (Bidirectional Encoder Representations from Transformers) with multivariate filter methods for feature selection. As short textual data proliferates across many domains, extracting meaningful insights from it remains difficult, motivating methods that account for context and isolate the most informative features for analysis. This study addresses that gap by examining how BERT's contextual representations and the discriminative power of multivariate filters can complement one another. In a series of controlled experiments, the proposed hybrid model improves the accuracy, interpretability, and speed of short-text analysis. The results show substantial gains in predictive metrics, indicating that the proposed approach offers a meaningful advance. Beyond its contribution to natural language processing, this work has practical applications in areas such as sentiment analysis, social media monitoring, and customer feedback analysis. As digital communication continues to evolve, the method is well positioned to uncover nuanced insights in information-dense short texts.
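The pipeline described above can be sketched as a two-stage process: obtain contextual sentence embeddings from BERT, then apply a filter-based feature selector before classification. The sketch below is a minimal illustration, not the paper's exact method; it uses randomly generated vectors as a stand-in for BERT embeddings (which in practice would come from a library such as HuggingFace Transformers) and scikit-learn's mutual-information filter as one example of a multivariate filter criterion.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

rng = np.random.default_rng(0)

# Placeholder for BERT output: in practice each row would be a 768-dim
# [CLS] embedding of one short text, produced by a pretrained encoder.
X = rng.normal(size=(200, 64))

# Synthetic labels correlated with a few embedding dimensions, so the
# filter has a real signal to detect.
y = (X[:, 0] + X[:, 3] > 0).astype(int)

# Filter-based feature selection: keep the k embedding dimensions with
# the highest estimated mutual information with the label.
selector = SelectKBest(score_func=mutual_info_classif, k=8)
X_selected = selector.fit_transform(X, y)

print(X_selected.shape)  # reduced feature matrix: (200, 8)
```

The selected, lower-dimensional matrix would then be passed to any downstream classifier; the filter step is model-agnostic, which is what makes it cheap to combine with heavyweight encoders like BERT.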