
Internet Law And How It Affects Users (page 3 of 4)

The United Kingdom Government has stated that service providers who cooperate with the Safety-Net proposals will reduce their risk of prosecution by demonstrating that they have done all that can reasonably be expected of them to comply with the law.

A somewhat different regulatory scheme has emerged in Singapore, where a class licensing scheme for the Internet has been developed. Under the scheme, Internet service providers must register with the Singapore Broadcasting Authority (SBA) and comply with the conditions of the class license. All licensed Internet service providers and relevant Internet content providers are required to comply with the Internet Content Guidelines developed as part of the scheme, and to satisfy the SBA that they have taken adequate steps to fulfil this requirement. Service providers must also develop Acceptable Use Policies for Internet account holders and comply with any codes of practice the SBA issues from time to time.

In Malaysia, a major source of regulation to date has been the conditions of service pertaining to the state-owned Joint Advanced Research Integrated Networking (JARING) network. These set out the rules applying to Internet services and include the provision that those using the network shall not use it for any activities not allowed under any law of Malaysia. Until recently JARING was the only service provider in Malaysia, but a second operator, Malaysia Telekom Berhad, is reported to have begun trading.

Legislation Relating to On-line Content

As discussed earlier, most countries have some laws which make it an offence to publish and/or distribute certain material. In some countries it is also an offence to possess certain material, such as child pornography. In some cases existing legislation regarding the publication, distribution and possession of certain types of content has been drafted in a manner which enables it to encompass material transmitted and accessed on-line. In other jurisdictions, however, specific legislation to deal with illegal content in the on-line environment has been proposed or introduced.

The pilot study is examining some of the legislation which applies to on-line services in each of the four countries and the types of material which may be prohibited within those jurisdictions. Not surprisingly, the pilot study has identified some significant variations in the type of material which would be illegal in each of the countries studied. These variations reflect the range of different political, cultural and religious values which operate in the different jurisdictions.

For example, in Singapore licensees are required to use their best efforts to ensure that their services do not include Internet content which is ‘against the public interest, public order, national harmony or which offends against good taste or decency’. The scheme identifies particular Internet content which falls within this category. This includes, inter alia, content which propagates permissiveness or promiscuity, tends to bring the Government of Singapore into hatred or contempt, or which excites disaffection against the Government of Singapore. In contrast, it seems that at least some content of this nature would not be illegal in the United Kingdom and Australia.

The pilot study will also examine potential liability for illegal content, and consider whether in addition to the content provider, any other party, such as service providers and/or users, may incur liability and/or obligations in respect of certain material. In this regard we have found that in Singapore, service providers are required under the class licensing scheme to block access to objectionable sites as directed by the SBA and observe the SBA’s content guidelines for subscribing to newsgroups.

In the United Kingdom codes of practice are being utilised to place obligations on service providers to deal with, and where possible remove, illegal content from their servers once it has been identified. In exchange for complying with these procedures, service providers may receive some protection from prosecution. In Australia the proposed regulatory regime also envisages that codes of practice for service providers will include appropriate procedures to deal with illegal content if and when it is brought to a service provider’s attention. It has been suggested that compliance with codes of practice could be a defence in any prosecution against a service provider.

E-mail Hotlines

One of the recent responses to the issue of illegal content has been the setting up of reporting agencies, often referred to as ‘e-mail hotlines’. These agencies enable Internet users who locate content which they believe is illegal to report their finding to a reporting agency which will investigate the content and may take action in relation to the material. To date e-mail hotline services have been identified in Singapore and the United Kingdom.

In the United Kingdom an e-mail hotline has been set up under the Safety-Net Initiative. The Hotline service is operated by Internet Watch and aims to enhance the enormous potential of the Internet to inform, educate, entertain and conduct business by … hindering the use of the Internet to transmit material which is illegal in the UK. The first priority of the scheme is child pornography.

Internet Watch operates by encouraging Internet users to report content which they believe to be child pornography. Reports can be made by telephone, facsimile or e-mail and should contain a brief description of the material in question and the location of the site (i.e. the World Wide Web address or the Usenet newsgroup details). Upon receiving a report, Internet Watch examines the material and, if it considers it illegal, will seek to trace its origins. Where the material has been identified as being sourced within the United Kingdom, Internet Watch will request that the relevant service provider contact the person who has placed the material on-line and seek to have the material removed. If material is not removed, and/or if it involves child pornography, Internet Watch will forward details to the police. Under the scheme the police are to provide feedback to Internet Watch on the outcome of its report. Where the material is sourced from outside the United Kingdom, Internet Watch will forward the information about the site to the police, who will notify the relevant overseas law enforcement agency.

In Singapore, as part of its regulatory regime established in July 1996, the Singapore Broadcasting Authority has also set up a ‘hotline’ for identifying ‘objectionable content’. The SBA has stated that the success of Internet content regulation will depend very much on industry self-regulation and community action, and has welcomed public assistance in the identification of material which is considered ‘objectionable’ under the laws of Singapore. Reports can be made by e-mail, telephone or post.

In Australia the Australian Broadcasting Authority, in its Report on the Content of On-Line Services, supported in principle the establishment of e-mail hotlines for the reporting of child pornography. The ABA understands that police and other law enforcement bodies are presently examining the viability of establishing such an agency in Australia.

The Protection of Minors in the On-Line Environment

Many argue that parental or teacher supervision is the most effective way of managing children’s access to the Internet. However, it is widely recognised that this is not always possible and over the last decade, a range of software products have been developed for use on computers in the home and in schools in response to the perceived need for parents, teachers and supervisors to filter out or block Internet content which they do not consider suitable for minors in their care.

Filter software can be provided to the user by the on-line service provider or purchased independently by the consumer for use at the individual computer level. Some service providers also offer a filtering service which is specifically designed for use by minors. Filtering programs work in three main ways:

+ sites are blocked with reference to a list of known newsgroups, World Wide Web sites, File Transfer Protocol (FTP) and Gopher sites. Censoring decisions may be made centrally by the software producer, with the list updated regularly, and/or by users instructing their system to block particular sites;

+ requests for, or receipt of, information which contains specified keywords or phrases are blocked. The keywords or phrases can be specified by the software manufacturer or by the end users who can manage the blocking of web sites by entering keywords and phrases in accordance with criteria they consider appropriate; and

+ all sites are blocked except those which are specified by the supervisor (’blanket blocking’). These programs usually provide an original unblocked list of known educational sites, and the supervisor is able to unblock new sites through a password secured interface.

Some products enable parents to combine some of these functions, while others offer additional features such as auditing of all on-line activity, including listing sites accessed, restricting the amount of time spent on-line and limiting access to some services such as bulletin boards.
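The three filtering approaches described above can be sketched in code. This is a minimal illustrative sketch, not taken from any real product; all site names, keywords and mode labels are hypothetical.

```python
# Hypothetical sketch of the three filtering approaches: a blocklist of known
# sites, keyword blocking, and 'blanket blocking' (allow only a whitelist).
# All names below are illustrative assumptions, not from any real product.

BLOCKLIST = {"badsite.example", "alt.unsuitable"}      # approach 1: known-site list
KEYWORDS = {"keyword1", "keyword2"}                    # approach 2: specified words
WHITELIST = {"education.example", "museum.example"}    # approach 3: approved sites only

def allowed(site: str, text: str, mode: str = "blocklist") -> bool:
    """Return True if the request should be permitted under the chosen mode."""
    if mode == "blocklist":        # block only sites on the known-site list
        return site not in BLOCKLIST
    if mode == "keyword":          # block pages containing any specified keyword
        words = text.lower().split()
        return not any(k in words for k in KEYWORDS)
    if mode == "whitelist":        # blanket blocking: allow only listed sites
        return site in WHITELIST
    raise ValueError(f"unknown mode: {mode}")
```

In practice, as the text notes, products often combine these modes and let the supervisor edit the lists through a password-protected interface.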

Whilst the first generation of filter products offers parents some tools to control their children’s access to unsuitable material, some users have found that they have a number of limitations. For example, products based on recognition of keywords and phrases are not able to filter explicit graphic images or deal with encrypted data or transposed text. At the same time, sites of value to the user may be blocked if they contain specified keywords or phrases. For example, press reports stated that at one time access to the White House web site was blocked by a filter responding to the word ‘couple’.

Where software producers are responsible for the blocking of material they may restrict content which is considered acceptable to the parent/supervisor. Where filter software operates by means of ‘blanket’ blocking, access to a considerable amount of potentially valuable material would be denied and parents and teachers would in many cases have to allocate considerable resources to enable access to appropriate material.

However, there have been some recent international developments in content selection software which have provided additional options for parents. This technology is called the Platform for Internet Content Selection (PICS). PICS was developed by the World Wide Web Consortium (W3C), a non-governmental cross-industry group which has offices in the USA, France and Japan and comprises representatives of the computer, communications and content industries, as well as trade associations and public interest groups.

PICS works by establishing a series of file formats that allow independent rating services to label Internet content in a form that can be recognised by PICS enabled software. The PICS labelling conventions do not dictate the use of any particular labelling scheme or determine who should pay attention to which labels. The PICS technology allows any organisation or individual to develop a labelling system which reflects their tastes and standards and, if they wish, to enable others to use these labels. PICS compatible software can be set to block all unrated content, or to block only content which has been rated in a certain manner.
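A rough sketch of how PICS enabled software might read a label and apply a user's settings is shown below. The label string follows the general shape of a PICS-1.1 label but is simplified for illustration; the rating service URL and category names are hypothetical, and real PICS labels carry additional fields this sketch ignores.

```python
import re

# Simplified, illustrative PICS-style label; the URL and the category names
# ('violence', 'sex') are hypothetical assumptions, not a real rating service.
label = '(PICS-1.1 "http://ratings.example/v1.0" labels ratings (violence 2 sex 0))'

def parse_ratings(label: str) -> dict:
    """Extract category/value pairs from the 'ratings (...)' clause, if present."""
    m = re.search(r"ratings \(([^)]*)\)", label)
    if m is None:
        return {}                      # no ratings clause: treat as unrated content
    tokens = m.group(1).split()
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

def blocked(label: str, limits: dict, block_unrated: bool = True) -> bool:
    """Block if any rated category exceeds the user's limit for that category."""
    ratings = parse_ratings(label)
    if not ratings:
        return block_unrated           # software may be set to block unrated content
    return any(ratings.get(cat, 0) > limit for cat, limit in limits.items())
```

The `block_unrated` flag corresponds to the choice described above: PICS compatible software can be set to block all unrated content, or to block only content rated beyond certain levels.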

There are a number of scenarios in which PICS labels can be used. For example, labels may be developed to indicate whether Internet content contains sexual or violent material, enabling parents to select or block the material accordingly. A teacher might label a set of materials which are relevant to an astronomy lesson and block student access to everything else during that class. Those concerned about Nazi propaganda and other hate speech available on-line could label material that is historically inaccurate or that promotes hate, and businesses may wish to use labels to block their employees from visiting recreational sites during office hours.

Labelling can be applied directly by content providers at the time or after a site is posted on the Internet or it can be applied by a third party in accordance with an established labelling system.

The creators of PICS envisaged that it could be used to support a large number of different labelling schemes which may have a multitude of different purposes. A number of organisations have already begun to establish labelling schemes which can interact with the PICS standards, most of which have been developed in North America and aim to assist parents in managing their children’s access to Internet content. Examples are the Recreational Software Advisory Council labelling scheme for the Internet (RSACi), SafeSurf, Cyber Patrol and Net Shepherd.

One PICS compatible labelling scheme which has gained increasing prominence is the descriptive labelling scheme developed by the Recreational Software Advisory Council (RSAC); it is discussed below by way of example.

The RSACi labelling system addresses four matters. These are the level of violence, sex, nudity, and language, including hate language, which a site contains. Internet content can be given a rating between 0 and 4 on each of these topics. The label aims to provide a description of the content in an Internet site rather than make a judgement about the appropriateness of content for any given audience or purpose. In some ways the descriptive RSAC labels can be seen as analogous to the labels applied to foodstuffs which merely provide information about ingredients of a product, rather than vouch for its suitability for consumption.

One of the uses of a descriptive labelling scheme is that it enables parents to set their own standards about the material which they or their children access. For example, parents who are concerned about sexual content can set their PICS compatible software to block material which has been given a high rating for sexual content. Other parents may be more concerned about language and/or violence and can set their software to block highly rated material of that nature. Provided the material has been rated, parents can use the labels to control Internet content regardless of where they or the content is located.
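The way different households can apply the same descriptive label can be sketched as follows. The sample ratings and family limits are invented for illustration; the four category names follow the RSACi topics described above, each rated 0 to 4.

```python
# Illustrative sketch: one descriptive label (RSACi-style, 0-4 per category),
# interpreted differently by two families. The ratings and limits are
# hypothetical examples, not real RSACi data.

site_label = {"violence": 1, "sex": 3, "nudity": 2, "language": 0}

def permit(label: dict, limits: dict) -> bool:
    """Allow the site only if every category a family cares about is at or
    below that family's chosen limit."""
    return all(label.get(cat, 0) <= limit for cat, limit in limits.items())

# One family restricts sexual content; another restricts violence and language.
family_a = {"sex": 1}                      # site's sex rating (3) exceeds the limit
family_b = {"violence": 2, "language": 2}  # both categories are within the limits
```

The same label thus yields different outcomes depending on each family's settings, which is the point of a descriptive (rather than judgemental) scheme.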

Importantly, those who do not wish to block their own access to Internet materials need not activate these tools.

A proliferation of PICS compatible labelling schemes, perhaps being developed by a range of industry, community and governmental bodies around the world could serve many functions, including giving consumers a range of choices in relation to the values and perspectives which underpin the various labelling schemes and the purposes for which they can be used. This has the ability to reflect a diversity of different cultures and values and provide individuals with real choices about the filtering software and nature of Internet content which they access.

However, for labels to be a truly effective tool for parents and supervisors many argue that a critical mass of content needs to be labelled. This raises the issue of whether it would be advantageous to develop a small number of international labelling schemes which can be widely recognised by content providers and users around the globe.

The pilot study will examine the steps taken, if any, in each of the four countries to address the issue of children’s access to unsuitable material on the Internet. This will include any programs adopted or recommendations made in relation to filtering systems and PICS compatible labelling schemes.

To date we have found that the Safety-Net initiative developed in the United Kingdom originally supported the use of the RSACi labelling scheme and proposed that service providers require their users to rate their web sites using RSACi labels. However, the ABA understands that there have been some recent developments in this regard and that the Internet Watch Foundation has undertaken to work with the on-line industry in that country to develop a PICS compatible labelling scheme which meets the specific requirements of users in the United Kingdom.

In Australia labelling Internet content was given detailed consideration in the ABA’s Report into the Content of On-Line Services. While recognising that labelling schemes might have many uses, the protection of children from unsuitable material was seen as one of the primary uses. The Report recommended that an On-Line Labelling Task Force be established to encourage development of an Australian labelling scheme for on-line content which was compatible with PICS or any superior standards which might be developed.

The ABA is also conscious of the need to ensure that an Australian labelling scheme is, as far as possible, consistent with international ratings developments. This is to ensure that Australian content providers and users of the labelling scheme are not disadvantaged, either in having Australian content accessed by users around the world or in accessing a wide variety of material when utilising the labels.