How is Florida really different from the rest of America?
I've lived in several states, in different parts of the country. I've visited every state, virtually every major city, and almost every place where Black people have a significant population.
My personal experience is anecdotal, sure -- but it's still better informed than that of most people who want to single out Florida. Believe it or not, I couldn't care less about the opinions of the uninformed. My concern is that people are so easily manipulated by the stuff they see online that they stop thinking.
Florida is no different from the rest of the country -- it has better weather and a lower cost of living, but it's no more racist (rough on Black people) than New York, Texas, Illinois, Georgia, or California.