Do any jurisdictions seriously consider reclassifying social media websites as publishers?
The US Communications Decency Act of 1996 was the first attempt to regulate pornography on the internet.
Moreover, Section 230 of the act has been used by internet companies to avoid being characterised as publishers, and thus to avoid the responsibilities and obligations that publishers bear.
Still, it is very difficult to look at a company like Facebook and not see it as a massive publishing company. That they are called 'new media' or 'social media' companies suggests this is how they are seen de facto, if not de jure.
Is it perhaps time for legislators to take another look at this legal provision, given that questions have now been raised about the effectiveness of social media self-regulation of content?
Q. Are any jurisdictions seriously considering relaxing this kind of provision?
regulation social-media fake-news
14
@Obie2.0 I think the difference is that if someone libels you in a book, you can sue both the author and the publisher, so publishers don't publish authors who do that. If someone posts libel on Facebook, you can't sue Facebook for letting them do that.
– Jeff Lambert
Apr 18 at 12:55
4
@Obie2.0: Less dramatically, if Facebook is considered a publisher, they can also be sued for any instance of copyright infringement, which may include anything from linked news articles to pictures of memes.
– hszmv
Apr 18 at 13:32
1
Facebook in Germany supports the impressum, the mandatory data for anyone publishing; I'm not sure whether this means they're counted as a publisher. Facebook and Twitter certainly follow German law on not displaying Nazi symbols.
– pjc50
Apr 18 at 14:31
1
This isn't an answer to the question, but regarding some assumptions made in the question: there is a very large difference between a platform like Facebook or Twitter (or, say, the comments section on a blog or news article) and a "publisher" such as the blog itself or the actual news articles themselves. Namely, the difference is user-generated content vs. editorial content curated by the owner of the site/platform. Social media of any sort (as well as sites like Stack Exchange) would be almost completely impossible if they had editorial liability for user-generated content.
– reirab
Apr 18 at 20:56
1
A site would not be able to hire enough editors to review all user-generated content for accuracy or copyright infringement.
– reirab
Apr 18 at 20:57
asked Apr 18 at 12:27 by Mozibur Ullah (edited Apr 18 at 13:12)
2 Answers
The Attorneys General of 47 states sent a letter to Congress in July 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.
The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later:
If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.
Matt Zimmerman of the Electronic Frontier Foundation had this to say about the AGs' proposal:
Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.
[...]
Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).
It's not clear whether the ACLU or EFF are against modifying the law in any way, or whether they were just against the AGs' specific recommended course of action.
Congress did vote in March 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking, with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:
Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''
The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.
It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."
Yes, the Lords in the UK debated this in January 2018. From parliament.uk:
Lords debates online news and content publishers
Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.
This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.
The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.
In May 2018, the government stated the following in reply to a green paper (on page 14 of the linked document):
Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our Strategy.
2 Answers
2
active
oldest
votes
2 Answers
2
active
oldest
votes
active
oldest
votes
active
oldest
votes
Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.
The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.
If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.
Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:
Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.
[...]
Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).
It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.
Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:
Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''
The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.
It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."
add a comment |
Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.
The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.
If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.
Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:
Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.
[...]
Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).
It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.
Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:
Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''
The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.
It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."
add a comment |
Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.
The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.
If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.
Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:
Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.
[...]
Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).
It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.
Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:
Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''
The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.
It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."
Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.
The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.
If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.
Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:
Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.
[...]
Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).
It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.
Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:
Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''
The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.
It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."
edited Apr 18 at 13:27
answered Apr 18 at 13:18
Jeff LambertJeff Lambert
10.6k52951
10.6k52951
add a comment |
add a comment |
Yes, the Lords in the UK did debate this in January of 2018. From parliament.uk:
Lords debates online news and content publishers
Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.
This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.
The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.
In May 2018, the government stated the following in reply to a green paper (page 14 of the linked document):

Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our Strategy
answered Apr 18 at 13:12 by JJJ (edited Apr 18 at 13:23)
@Obie2.0 I think the difference is that if someone libels you in a book, you can sue both the author and the publisher, so publishers don't publish authors who do that. If someone posts libel on Facebook, you can't sue Facebook for letting them do that. – Jeff Lambert, Apr 18 at 12:55

@Obie2.0: Less dramatically, if Facebook is considered a publisher, they can also be sued for any instance of copyright infringement, which may include anything from linked news articles to pictures of memes. – hszmv, Apr 18 at 13:32

Facebook in Germany supports the Impressum, the mandatory data for anyone publishing; I'm not sure whether this means they're counted as a publisher. Facebook and Twitter certainly follow German law on not displaying Nazi symbols. – pjc50, Apr 18 at 14:31

This isn't an answer to the question, but regarding some assumptions made in the question: there is a very large difference between a platform like Facebook or Twitter, or, say, a comments section on a blog or news article, versus a "publisher" such as the actual blog itself or the actual news articles themselves. Namely, the difference is user-generated content versus editorial content curated by the owner of the site/platform. Social media of any sort (as well as sites like Stack Exchange) would be almost completely impossible if they had editorial liability for user-generated content. – reirab, Apr 18 at 20:56

A site would not be able to hire enough editors to review all user-generated content for accuracy or copyright infringement. – reirab, Apr 18 at 20:57