NY Social Media Expert Kris Ruby on Fox Business discussing Facebook’s postcard verification for ad buyers

NY Social Media Marketing Expert Kris Ruby, CEO of Ruby Media Group, recently appeared on Fox Business with Charles Payne to discuss the Florida school shooting suspect's disturbing social media posts. Ruby offered Fox viewers insights into the modern-day challenges of social media misuse and the prevention of online threats. During the segment, Ruby also discussed Facebook's latest answer to Russian meddling: postcard verification for ad buyers.

To watch the full segment, click here. 

What are some current methods social media companies use to verify ad buyers and prevent election meddling?

Facebook has proposed sending postcards with verification codes to ad buyers’ physical addresses as a tech solution to confirm user identity and location. However, this method has limitations and can be circumvented, indicating the need for more robust verification systems.
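The postcard idea is essentially a one-time-code flow tied to a physical mailing address. A minimal sketch of how such a flow might work is below; all class and method names are illustrative assumptions, not Facebook's actual implementation.

```python
import secrets
import string

CODE_LENGTH = 8

def generate_code() -> str:
    """Generate a random alphanumeric one-time verification code."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(CODE_LENGTH))

class PostcardVerifier:
    """Hypothetical sketch: issue a code per advertiser, 'mail' it to their
    address, and later check the code they enter on the platform."""

    def __init__(self):
        self.pending = {}    # advertiser ID -> code mailed to their address
        self.verified = set()

    def mail_code(self, advertiser_id: str, address: str) -> str:
        """Issue a code for the given address. A real system would print and
        physically mail a postcard here; we return the code so the sketch
        can be exercised end to end."""
        code = generate_code()
        self.pending[advertiser_id] = code
        return code

    def confirm(self, advertiser_id: str, entered_code: str) -> bool:
        """Check the code the advertiser typed in after receiving the card."""
        if self.pending.get(advertiser_id) == entered_code:
            self.verified.add(advertiser_id)
            del self.pending[advertiser_id]
            return True
        return False
```

The sketch also shows why the method can be circumvented: it only proves someone can read mail at an address, not who that someone is.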

Who was Nikolas Cruz and what role did social media play in the Florida school shooting?

Nikolas Cruz was the suspect in the tragic Florida school shooting. His social media activity, including threatening posts on Instagram and comments on YouTube, indicated violent intent. These posts raised concerns but were not acted upon effectively by the platforms or by law enforcement, contributing to the failure to prevent the tragedy.

Are social media companies responsible for policing harmful content on their platforms?

Yes, social media companies bear significant responsibility for monitoring and managing harmful content. That responsibility is complex, however, and shared with users and law enforcement. Platforms must develop proactive measures and collaborate with both state and federal authorities to address threats effectively.

What improvements are suggested to prevent future tragedies linked to social media?

Improvements include better coordination between social media companies and law enforcement, thorough risk assessments when threats are reported, enhanced technology for detecting harmful content, and educating users to recognize credible threats and misinformation.


Navigating the Complex Landscape of Social Media Responsibility

The intersection of social media, public safety, and democratic integrity is one of the defining challenges of our time. Platforms like Facebook, Instagram, Twitter, and YouTube wield enormous influence, and their ability or inability to police their networks has profound implications for democracy.

The Florida School Shooting Incident: A Case Study in Social Media’s Role

Critical warning signs missed 

One of the most distressing examples of social media's potential dangers is the Florida school shooting involving Nikolas Cruz. His disturbing posts on Instagram and comments on YouTube raised alarm bells that many believe were either ignored or inadequately addressed by the platforms and authorities.

Cruz’s social media activity included explicit threats and indications of criminal intent. Yet, these signals failed to trigger effective intervention that could have potentially prevented the tragedy. This raises critical questions about the responsibility of social media companies to monitor and report such content.

This tragic situation illustrates the gaps in the system where social media companies, law enforcement, and intelligence agencies failed to act decisively.

Where Did the Breakdown Occur?

Ruby points out that the collective failure spanned the social media platforms, intelligence agencies, and law enforcement. The FBI was reportedly notified of Cruz's threatening YouTube comment and a self-harm video posted on Snapchat, but no action was taken to prevent the attack. This points to a severe breakdown in the response chain and highlights the need for better coordination and accountability, both within law enforcement agencies and between them and social media companies.

Proactive Risk Assessment

In the wake of these events, it is clear that no single entity can handle the complexities of social media threats alone. Kristen Ruby advocates for a better chain of command and comprehensive risk assessments when warning signs are reported.

“If something like this is reported, conduct a thorough risk assessment. What is this person’s threat level? Social media companies and law enforcement need to work together and not in silos.” – Kristen Ruby

This collaborative approach requires social media platforms to share relevant data responsibly with authorities while respecting privacy and legal boundaries. Likewise, law enforcement must be prepared to act swiftly and decisively when credible threats emerge.

Investment in advanced AI and machine learning tools for early detection of harmful content, combined with human oversight and cross-sector collaboration, could significantly improve prevention efforts.
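As a toy illustration of the "automated detection plus human oversight" pipeline described above, the sketch below scores a reported post with simple keyword rules and routes high-risk items to human review and escalation. The term list, weights, and thresholds are invented for illustration; real platforms use far more sophisticated ML models, and this is not any platform's actual system.

```python
# Illustrative weights for threat-related terms (assumed values, not real ones).
THREAT_TERMS = {"shoot": 3, "kill": 3, "bomb": 3, "hurt": 2, "revenge": 1}

def threat_score(post: str) -> int:
    """Sum the weights of threat-related terms found in the post."""
    text = post.lower()
    return sum(weight for term, weight in THREAT_TERMS.items() if term in text)

def triage(post: str) -> str:
    """Route a reported post based on its score: an automated first pass
    that escalates rather than decides, keeping humans in the loop."""
    score = threat_score(post)
    if score >= 3:
        return "escalate"  # notify the human review team and law enforcement
    if score >= 1:
        return "review"    # queue for human moderation
    return "dismiss"
```

The design point the sketch makes is the one in the text: automation handles the first pass at scale, but the high-stakes decisions, escalation and reporting to authorities, remain with people.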

Rapid Response Escalation 

The responsibility for preventing misuse of social media is shared among the platforms themselves, their users, and law enforcement agencies. While users must be educated to critically evaluate information, social media companies cannot evade their duty to monitor and manage content responsibly. At the same time, intelligence and law enforcement agencies must improve their responsiveness and collaboration with social media platforms.

The tragic Florida school shooting is a stark reminder of what can happen when warning signs are missed or ignored. It underscores the urgent need for better systems, technologies, and partnerships to detect and respond to threats before they escalate.

Ultimately, solving these issues requires a balanced approach that protects free speech while safeguarding public safety. It demands innovation, accountability, and a shared commitment from stakeholders to create a safer digital and physical environment.

Kristen Ruby argued that social media companies cannot abdicate their responsibility. The proliferation of fake news and disinformation is not merely a user problem but a platform problem as well. Facebook’s introduction of monitoring tools and fact-checking measures demonstrates an acknowledgment of this responsibility.

Ruby believes platforms must be held accountable for the content they distribute and should invest in better technologies and policies to prevent manipulation and harmful behavior. This includes improving algorithms, enhancing transparency, and collaborating with law enforcement.