The findings by Nexgate included:
- The average Fortune 100 firm has 320 social media accounts, 213,539 followers, and 1,159 employees making more than 500,000 posts to those accounts.
- Employees were responsible for 12 of the average 69 unmoderated compliance incidents per firm, while public commenters were responsible for the other 57.
- Nine different U.S. regulatory standards triggered incidents, including retail-communications and customer-response rules from the Financial Industry Regulatory Authority (FINRA), Regulation Z from the Federal Financial Institutions Examination Council, Federal Trade Commission rules on sweepstakes, and Regulation FD from the Securities and Exchange Commission.
- Financial-services firms, which make up 21 of the Fortune 100, accounted for more than 5,000 incidents, or more than 250 each.
- Food and Drug Administration adverse drug experience risks represented the most common social compliance risk for healthcare companies, with 98 incidents.
- Only 47 percent of posts by Fortune 100 brands were made via marketing and content-publishing applications.
Nexgate said in its report:
Informal culture, pace, scale and complexity separate the social media compliance challenge from more static public communications channels such as press, website and print. Ignoring these differences can overwhelm compliance staff and become a barrier to social success. As social programs grow within any organization, compliance staff needs to consider more dynamic, automated compliance processes.
Nexgate also provided the following suggestions on how to avoid compliance issues:
The first step in building a successful social compliance program is establishing the core team responsible for compliance. Social media compliance requires coordination between groups. The team should include social users (marketing, support, sales, etc.), compliance staff and the information security team (since this committee may also be leveraged to ensure social media security). The primary role of this cross-functional team is to assign clear roles and responsibilities within the organization for policy, training, enforcement and audit.
Develop a social media security and compliance policy covering approved business use, content and publishing workflow.
Approved business use: Define approved social account types and business uses. Is brand representation limited to corporate marketing accounts, or is approved usage extended to executives, sales, support and general employees? What business purpose is approved for each group (marketing, employee advocacy, prospecting, recruiting, etc.)? Which accounts are used for material earnings disclosures (if any)? Policy should also cover which social networks (Twitter, LinkedIn, etc.) are approved for each account type.
Content: Define what content is and is not allowed to be posted under each applicable regulation. Policy should also extend beyond compliance to cover security-related content (e.g., malware, scams, phishing) and acceptable use (profanity, hate speech, intolerance, etc.). Content policy should consider both brand-employee and public-commenter communications, and it may vary for different account types (SEC disclosure accounts, etc.).
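The kind of automated content screening the report alludes to can be illustrated with a minimal sketch. The policy categories, patterns and the `screen_post` helper below are purely hypothetical examples, not Nexgate's product or any regulator's actual rule set:

```python
import re

# Hypothetical policy categories with illustrative patterns; a real program
# would maintain these lists per regulation (FINRA, FTC, SEC, etc.).
POLICY_PATTERNS = {
    "security": [r"\bclick here to claim\b", r"\bfree iphone\b"],
    "acceptable_use": [r"\bdamn\b"],
    "sec_disclosure": [r"\bearnings\b", r"\bguidance\b"],
}

def screen_post(text, account_type="marketing"):
    """Return the policy categories a post may violate.

    Disclosure-related terms are flagged only on accounts not approved
    for material disclosures -- an assumption made for this sketch.
    """
    hits = []
    lowered = text.lower()
    for category, patterns in POLICY_PATTERNS.items():
        if category == "sec_disclosure" and account_type == "sec_disclosure":
            continue  # approved disclosure accounts may discuss earnings
        if any(re.search(p, lowered) for p in patterns):
            hits.append(category)
    return hits
```

In this sketch, the same post is treated differently depending on the account type, which mirrors the point above that content policy may vary for SEC disclosure accounts versus general marketing accounts.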
Publishing workflow: Define the publishing process employees are expected to use for corporate accounts. Policy should specify which publishing tools to use and when content must be reviewed. For example, FINRA requires that static brand content (profile data, major announcements, etc.) be reviewed before posting, while interactive conversations (social selling by brokers, etc.) may be audited after publication.
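The pre-review versus post-audit split described above can be sketched as a simple routing rule. The content types and queue names here are illustrative assumptions, not FINRA terminology:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical content types treated as "static" brand content that must be
# reviewed before it goes live; everything else is interactive conversation.
STATIC_TYPES = {"profile_update", "major_announcement"}

@dataclass
class PublishingQueue:
    pending_review: List[str] = field(default_factory=list)  # held before posting
    published: List[str] = field(default_factory=list)       # live content
    audit_queue: List[str] = field(default_factory=list)     # post-publication audit

    def submit(self, content: str, content_type: str) -> str:
        if content_type in STATIC_TYPES:
            # Static brand content: hold for compliance review before posting.
            self.pending_review.append(content)
            return "held for pre-review"
        # Interactive conversation: publish immediately, audit afterward.
        self.published.append(content)
        self.audit_queue.append(content)
        return "published, queued for audit"
```

The design point is that the workflow tool, not the individual employee, decides which review path a post takes, which is what makes the process auditable.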
Readers: What did you think of the findings by Nexgate?
Compliance meter image courtesy of Shutterstock.