Ethics for the Information Age -- Discussion Project on Responsibility for Online Content

Lately there have been calls for the companies that operate social media platforms like Facebook, Instagram, and Twitter (now X) to take more responsibility for monitoring and restricting the behavior of their users. Those companies have begun to respond, closing users' accounts in some cases and deleting or demoting content in others. Still, there is a fairly widespread perception that these efforts have been inadequate and that the tech companies that own and operate social media platforms are unlikely to do better unless compelled to do so by law.

This project asks you to look carefully at a specific set of proposals: the so-called STAR framework put out by the Center for Countering Digital Hate, online here: https://counterhate.com/wp-content/uploads/2022/09/Copy-of-STAR-Framework-for-website.pdf

Discuss and try to agree on answers to the following questions:

1. Work out, as precisely as you can, what each of their proposals would involve: What legislation would be needed? What rules would be put in place? Who would be responsible for implementing and/or enforcing those rules? And so on.

2. Which elements, if any, of their proposals are acceptable to you? Why (i.e., for what reasons)?

3. Which elements seem most problematic or, perhaps, unacceptable? Why?

 

In the interest of making sure each part of this rather large package gets some attention:

1. if you are in group number 1, please start with the proposals about safe design and algorithmic transparency (pages 7-13);

2. if you are in group number 2, please start with the proposals about transparency with respect to rules and economics (business models and advertising) (pages 13-18);

3. if you are in group number 3, please start with the proposals about accountability and responsibility (pages 19-21).

If you finish answering all three questions for the proposal set that you start with, move on to another set of your choice.