Index: openacs-4/packages/proctoring-support/catalog/proctoring-support.de_DE.ISO-8859-1.xml =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/catalog/Attic/proctoring-support.de_DE.ISO-8859-1.xml,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/catalog/proctoring-support.de_DE.ISO-8859-1.xml 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,214 @@ + + + + Zur Erkennung einer "Erschleichung der Prüfungsleistung" findet eine automatisierte Online-Aufsicht statt. Dabei werden Sie, Ihr Bildschirm und Ihr Mikrofon aufgezeichnet. Ihre Daten werden nur zum angegebenen Zweck verwendet, sicher verwahrt und nicht an Dritte weitergegeben. + Ihr Browser unterstützt keine Mikrofon-Aufnahme. + Der Prüfungsmodus läuft. Schließen Sie das Browserfenster daher erst nach Beendigung der Prüfung. + Die Kamera kann nicht aufgezeichnet werden, da Ihr Browser die Kameraaufnahme nicht unterstützt. Bitte verwenden Sie die neueste Version von Google Chrome. + Es wurde keine Kamera erkannt. Prüfen Sie das Folgende:<ol> <li>Prüfen Sie, ob die Kamera momentan in einer anderen Anwendung läuft. Wenn ja, dann beenden Sie die Kameraaufnahme in dieser Anwendung.</li> <li>Stellen Sie sicher, dass eine Kamera installiert und aktiviert ist. Öffnen Sie den Gerätemanager. Unter Bildgeräte oder Kamera wird die Kamera angezeigt. Wenn keine Kamera angezeigt wird, dann schließen Sie eine externe Kamera an Ihren Computer an. Wenn im Gerätemanager ein Abwärtspfeil angezeigt wird, bedeutet dies, dass das Kameragerät deaktiviert ist. Klicken Sie mit der rechten Maustaste auf die Kamera, um Aktivieren auszuwählen. Einige Kameras werden unter Kameras und Integrierte Kamera angezeigt.</li> <li>Wurde in den Systemeinstellungen der Zugriff auf die Kamera gestattet? Öffnen Sie die Systemeinstellungen Ihres Computers und gehen Sie zu \"Kamera\". 
Prüfen Sie, ob der Kamerazugriff allgemein aktiviert ist. Gehen Sie auf \"Ändern\", um den Kamerazugriff zu aktivieren. Prüfen Sie, ob Desktop-Apps auf Ihre Kamera zugreifen dürfen. Setzen Sie den Schieberegler auf \"Ein\".</li> <li>Wenn Sie eine externe Kamera verwenden, dann prüfen Sie, ob diese an den Computer angeschlossen ist und, sofern sie einen Schalter hat, auch angeschaltet ist.</li> <li>Stellen Sie sicher, dass die korrekte Kamera im Browser ausgewählt ist. Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Kamera und prüfen Sie die hier ausgewählte Kamera.</li></ol> + Die Kamera kann nicht aufgezeichnet werden, da sie keine Berechtigung dazu hat. Überprüfen Sie das Folgende:<ol> <li>Ist Ihre Kamera in den Browsereinstellungen blockiert? Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Kamera. Wenn der Schieberegler auf \"Blockiert\" steht, dann setzen Sie ihn auf <i>Vor</i> dem Zugriff nachfragen (empfohlen).</li> <li>Haben Sie den Zugriff auf die Kamera in der Vergangenheit einmal verweigert? Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Kamera. Prüfen Sie, ob unter \"Blockieren\" Links zu WU-Webseiten angeführt sind. Löschen Sie diese Webseiten aus der Liste über einen Klick auf das Mülltonnensymbol.</li> <li>Wurde in den Systemeinstellungen der Zugriff auf die Kamera gestattet? Öffnen Sie die Systemeinstellungen Ihres Computers und gehen Sie zu \"Kamera\". Prüfen Sie, ob der Kamerazugriff allgemein aktiviert ist. Gehen Sie auf \"Ändern\", um den Kamerazugriff zu aktivieren. Prüfen Sie, ob Desktop-Apps auf Ihre Kamera zugreifen dürfen. 
Setzen Sie den Schieberegler auf \"Ein\".</li></ol> + Nochmals prüfen + Wenn die automatisierte Online-Aufsicht aktiviert ist, erstellt das System Aufzeichnungen von den Teilnehmer/innen und deren Bildschirm. + Der Desktop kann nicht aufgezeichnet werden, da Ihr Browser die Bildschirmaufnahme nicht unterstützt. Bitte verwenden Sie die neueste Version von Google Chrome. + Der Desktop kann nicht aufgezeichnet werden, da die Berechtigung dazu fehlt. Prüfen Sie das Folgende:<ol> <li>Vergewissern Sie sich, dass Sie in den Systemeinstellungen und Browsereinstellungen Ihrem Browser die Erlaubnis erteilt haben, Ihren Bildschirm aufzunehmen.</li> <li>Überprüfen Sie die Einstellungen Ihres Antivirenprogramms und deaktivieren Sie den Adblocker in Google Chrome.</li> <li>Löschen Sie den Cache in den Einstellungen von Google Chrome.</li></ol> + Der Desktop kann nicht aufgezeichnet werden, da Ihr Browser die Vollbildauswahl nicht unterstützt. Bitte verwenden Sie die neueste Version von Google Chrome. + <a href="proctoring">Liste der Aufzeichnungen während einer automatisierten Online-Aufsicht</a> + Prüfungsmodus + <h4>Prüfungserklärung: Wichtige Informationen +zur Absolvierung von schriftlichen Online-Prüfungen</h4> +<p> + +</p> +<h5>1) Teilnahme an einer Prüfung</h5> +<p>Die Teilnahme an +einer Prüfung ist ausnahmslos nur mit gültiger +Lehrveranstaltungsanmeldung bzw. 
Prüfungsanmeldung in LPIS möglich.</p> +<p> + +</p> +<p>Eine Beurteilung der +Prüfung erfolgt nur, wenn alle drei folgenden Bedingungen erfüllt +sind:</p> +<ul> + <li><p>Ein Foto zur + Identitätsfeststellung wurde, entsprechend den Vorgaben, + hochgeladen, sofern vom/von der Prüfungsverantwortlichen eine + solche Identitätsfeststellung für die entsprechende Prüfung + angekündigt wurde.</p> + <li><p>Der + automatisierten Online-Aufsicht wurde zugestimmt, sofern diese + vom/von der Prüfungsverantwortlichen für diese Prüfung + angekündigt wurde.</p> + <li><p>Die + Prüfungserklärung wurde bestätigt.</p> +</ul> +<p> + +</p> +<p>Die Bestätigung der +Prüfungserklärung entspricht der Entgegennahme der Prüfung. Haben +Sie die Prüfung entgegengenommen, aber ggf. die +"Identitätsfeststellung" und/oder ggf. die Zustimmung zur +"Online-Aufsicht" nicht erfüllt, wird die Prüfung mit "NICHTIG" +bewertet und der Antritt gezählt. Haben Sie die Prüfungserklärung +nicht bestätigt, dann erhalten Sie keinen Zugang zur Prüfung, es +erfolgt somit keine Beurteilung und der Antritt wird nicht gezählt. +</p> +<p> + +</p> +<h5>2) Technische Voraussetzungen für die Teilnahme</h5> +<p>Sie sind für eine +störungsfreie Prüfungsumgebung und das Vorhandensein der vorab +bekanntgegebenen, notwendigen technischen Voraussetzungen selbst +verantwortlich (siehe "Online-Prüfung" bzw. "Technik +Checkliste" in Ihrer Online-Prüfungsumgebung). Die WU kann die +störungsfreie Absolvierung der Prüfung nicht für jeden/jede +Studierende auf dem individuellen Endgerät garantieren. +</p> +<p> + +</p> +<h5>3) Starten und Abbruch/Unterbrechung einer Prüfung</h5> +<p>Ihre Bestätigung +dieser Prüfungserklärung gilt als Entgegennahme der Prüfung und +ist somit ein Prüfungsantritt. Die Prüfung wird beurteilt und auf +die Gesamtzahl der Wiederholungen angerechnet. Dies gilt auch bei +vorzeitigem Abbruch der Prüfung bzw. wenn Sie die Prüfung nicht +abgeben. 
+</p> +<p> + +</p> +<p>Müssen Sie aufgrund +technischer Probleme (z.B. Ausfall Ihrer Internetverbindung) Ihre +Prüfung abbrechen bzw. unterbrechen, dann wenden Sie sich bitte +unverzüglich an Ihre/n Prüfungsverantwortliche/n. +Nutzen Sie dazu den von der/dem Prüfungsverantwortlichen +vorab kommunizierten Kanal, z.B. via E-Mail oder Microsoft Teams. +Melden Sie den Abbruch/die Unterbrechung der Prüfung inkl. folgender +Angaben:</p> +<ul> + <li><p>Ihre + Matrikelnummer</p> + <li><p>Zeitpunkt des + Abbruchs bzw. der Unterbrechung</p> + <li><p>Ggf. Screenshot + der Fehlermeldung + </p> +</ul> +<p>Sollte Ihnen eine +Wiederaufnahme der Prüfung möglich sein, melden Sie dies ebenfalls +mit der Nachricht "Wiederaufnahme der Prüfung". Es empfiehlt +sich, für die Meldung ggf. vorab die passenden Apps auf mobile Geräte +herunterzuladen, z.B. um die Meldung auch während eines Ausfalls der +WLAN-Verbindung schicken zu können. +</p> +<p> + +</p> +<p>In gemeldeten Fällen +von Abbrüchen/Unterbrechungen aufgrund technischer Probleme während +der Prüfung, die außerhalb Ihrer Verantwortung liegen, wird die +Prüfung grundsätzlich nicht beurteilt und nicht auf die Gesamtzahl +der Wiederholungen angerechnet. Wenn Sie trotzdem eine Beurteilung +der Prüfung möchten, teilen Sie das bitte der/dem +Prüfungsverantwortlichen gleich nach der Prüfung mit. Die +Beurteilung erfolgt nur, wenn alle Bedingungen für eine Beurteilung +erfüllt sind und beurteilt werden nur die Teile der Prüfung, die +ohne technische Probleme absolviert wurden (z.B. nur bei +funktionierender Online-Aufsicht).</p> +<p> + +</p> +<h5>4) Leistungserschleichung und +Identitätsfeststellung</h5> +<p>Der Versuch, die +Beurteilung durch unerlaubte Hilfsmittel zu erschleichen (z.B. +Mobiltelefone, Zuhilfenahme nicht erlaubter Lehrunterlagen, Absprache +mit anderen Personen), führt dazu, dass die Prüfung mit "NICHTIG" +bewertet wird und der Antritt gezählt wird. 
Zudem werden Sie für +die Dauer von 4 Monaten ab dem Prüfungsdatum für weitere +Anmeldungen zu der betreffenden Prüfung gesperrt. +</p> +<p> + +</p> +<p>Die Abfassung einer +Prüfung für eine andere Person kann strafrechtliche Folgen nach +sich ziehen und wird ausnahmslos bei der Staatsanwaltschaft +angezeigt.</p> +<p> + +</p> +<p>Während der +gesamten Prüfungsdauer wird eine automatisierte Online-Aufsicht +durchgeführt, sofern diese vom/von der Prüfungsverantwortlichen für +diese Prüfung angekündigt wurde. Dazu werden während der Prüfung +laufend Aufnahmen von Ihnen (Kamera und Ton) und Ihrem Bildschirm +gemacht, die von den Prüfungsverantwortlichen eingesehen werden +können. Für die Online-Aufsicht müssen Sie vorab Ihre Zustimmung +unter dem Punkt "Zustimmung zur Online-Aufsicht" gegeben haben. +Beim Starten der Prüfung erteilen Sie Ihrem Browser die Erlaubnis, +den ganzen Bildschirm, die Kamera und den Ton aufzuzeichnen. Die +Manipulation der Online-Aufsicht wird als Erschleichungsversuch +gewertet.</p> +<p> + +</p> +<h5>5) Erlaubte Hilfsmittel</h5> +<p>Es sind nur solche +Hilfsmittel für die Bearbeitung dieser Prüfung erlaubt, die von den +Prüfungsverantwortlichen in den Informationen zur Prüfung unter +"Online-Prüfung" explizit aufgelistet wurden. Während der +Prüfung dürfen sich grundsätzlich keine anderen Personen im selben +Raum befinden. +</p> +<p> + + +</p> +<p><b>Kenntnisnahme der +Informationen zur Absolvierung der Prüfung und Entgegennahme der +Prüfung +</b></p> +<p>Ich bestätige +hiermit, dass ich</p> +<ul> + <li>alle oben + genannten Informationen zur Absolvierung der schriftlichen + Online-Prüfung verstanden habe, + <li>alle genannten + Bedingungen für die Teilnahme an der Prüfung erfülle, + <li>die Prüfung + selbstständig und ohne unerlaubte Hilfsmittel absolvieren werde und + <li>die Prüfung + mit dieser Bestätigung entgegennehme, d.h. es erfolgt eine + Beurteilung und dieser Antritt wird auf die Gesamtzahl der + Wiederholungen angerechnet. 
+</ul> + Erlauben Sie den Zugriff auf Ihre Kamera, wenn Ihr Browser Sie dazu auffordert. Haben Sie den Zugriff erlaubt, wird Ihnen eine Vorschau angezeigt. Stellen Sie sicher, dass Ihr Gesicht im Vorschaubild gut zu erkennen ist. + Zugriff auf Ihre Kamera erlauben + Sie werden von Ihrem Browser aufgefordert, den Zugriff auf Ihren Bildschirm zu erlauben. Bitte wählen Sie zuerst den gesamten Bildschirm aus und bestätigen Sie die Erlaubnis über den Button "Teilen". Ihnen wird eine Vorschau Ihres Bildschirms angezeigt. + Zugriff auf Ihren Bildschirm erlauben + Erlauben Sie den Zugriff auf Ihr Mikrofon, wenn Ihr Browser Sie dazu auffordert. Haben Sie den Zugriff erlaubt, wird Ihnen eine Audiowelle angezeigt, damit Sie Ihr Mikrofon testen können. Stellen Sie sicher, dass Ihr Mikrofon nicht stumm geschaltet ist und dass sich die Audiowelle bewegt und auf Ihre Stimme reagiert. + Zugriff auf Ihr Mikrofon erlauben + Es wurde kein Mikrofon erkannt. Prüfen Sie das Folgende:<ol> <li>Stellen Sie sicher, dass ein Mikrofon installiert ist. Öffnen Sie die Systemsteuerung Ihres Computers und gehen Sie zu Sound | Eingabe und überprüfen Sie, ob ein Mikrofon angeführt ist. Testen Sie das Mikrofon und führen Sie ggf. eine Problembehandlung aus.</li> <li>Wenn Sie ein externes Mikrofon verwenden, dann prüfen Sie, ob dieses an den Computer angeschlossen ist und, sofern es einen Schalter hat, auch angeschaltet ist.</li> <li>Stellen Sie sicher, dass das korrekte Mikrofon im Browser ausgewählt ist. Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Mikrofon und prüfen Sie das hier ausgewählte Mikrofon.</li></ol> + Der Zugriff auf Ihr Mikrofon wurde verhindert. Überprüfen Sie bitte, ob in den Systemeinstellungen der Zugriff auf das Mikrofon gestattet ist. Öffnen Sie die Systemeinstellungen Ihres Computers und gehen Sie zu \"Mikrofon\". Prüfen Sie, ob der Mikrofonzugriff allgemein aktiviert ist. 
Gehen Sie auf \"Ändern\", um den Mikrofonzugriff zu aktivieren. Prüfen Sie, ob Apps und Desktop-Apps auf Ihr Mikrofon zugreifen dürfen. Setzen Sie den Schieberegler auf \"Ein\". + Ihr Mikrofon kann nicht aufgezeichnet werden, da es keine Berechtigung dazu hat. Prüfen Sie das Folgende:<ol><li>Ist Ihr Mikrofon in den Browsereinstellungen blockiert? Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Mikrofon. Wenn der Schieberegler auf \"Blockiert\" steht, dann setzen Sie ihn auf <i>Vor</i> dem Zugriff nachfragen (empfohlen).</li><li>Haben Sie den Zugriff auf das Mikrofon in der Vergangenheit einmal verweigert? Gehen Sie in Google Chrome auf: Browsereinstellungen | Datenschutz und Sicherheit | Webseite Einstellungen | Berechtigungen | Mikrofon. Prüfen Sie, ob unter \"Blockieren\" Links zu WU-Webseiten angeführt sind. Löschen Sie diese Webseiten aus der Liste über einen Klick auf das Mülltonnensymbol.</li><li>Wurde in den Systemeinstellungen der Zugriff auf das Mikrofon gestattet? Öffnen Sie die Systemeinstellungen Ihres Computers und gehen Sie zu \"Mikrofon\". Prüfen Sie, ob der Mikrofonzugriff allgemein aktiviert ist. Gehen Sie auf \"Ändern\", um den Mikrofonzugriff zu aktivieren. Prüfen Sie, ob Desktop-Apps auf Ihr Mikrofon zugreifen dürfen. Setzen Sie den Schieberegler auf \"Ein\".</li></ol> + Ihr Mikrofon ist zu leise eingestellt. Bitte erhöhen Sie die Lautstärke. Prüfen Sie das Folgende: +<ol> + <li>Öffnen Sie die Systemsteuerung Ihres Computers und gehen Sie zu: Sound | Erweiterte Soundoptionen. Passen Sie hier ggf. die Lautstärke an.</li> +</ol> + Ihr Browser unterstützt keine Kamera-, Mikrofon- oder Bildschirmaufnahmen oder Sie haben den Zugriff auf diese Geräte verweigert. 
Bitte warten Sie, bis die Seite neu geladen ist, und erlauben Sie den Zugriff auf Kamera, Mikrofon und Ihre Vollbildaufnahme und vergewissern Sie sich, dass Ihr Mikrofon aktiviert und nicht stumm geschaltet ist, um mit der Prüfung fortzufahren. + Mobile Geräte werden nicht unterstützt. + Falls aktiviert, wird den Benutzern während der Online-Aufsichtssitzung eine Vorschau der aufgezeichneten Eingaben angezeigt. + Automatisierte Online-Aufsicht + Aufzeichnungen + Foto + Abschließen + Der Desktop kann nicht aufgezeichnet werden, da Sie nicht Ihren gesamten Bildschirm ausgewählt haben. Wählen Sie \"Gesamten Bildschirm\" aus. + Wichtiger Hinweis:<br><br>Auf LEARN wird angezeigt, dass Ihre Kamera momentan schwarze bzw. unlesbare Aufnahmen sendet. Dies kann für Sie einen ungültigen Prüfungsantritt zur Folge haben.<ul><li>Vergewissern Sie sich bitte, dass Ihre Kamera funktioniert und nicht abgedeckt ist.</li><li>Stellen Sie bitte sicher, dass Ihr Endgerät an ein Netzkabel angesteckt ist und nicht im Batteriemodus läuft.</li><li>Sie können die Kamerabild-Vorschau auf der Testseite unter <a href=\"https://learn.wu.ac.at/browser-multimedia-test\">https://learn.wu.ac.at/browser-multimedia-test</a> überprüfen.</li></ul> + Wichtiger Hinweis:<br><br>Auf LEARN wird angezeigt, dass Ihr Bildschirm momentan schwarze bzw. unlesbare Aufnahmen sendet. Dies kann für Sie einen ungültigen Prüfungsantritt zur Folge haben.<ul><li>Stellen Sie bitte sicher, dass Ihr Endgerät an ein Netzkabel angesteckt ist und nicht im Batteriemodus läuft.</li><li>Sie können die Kamerabild-Vorschau auf der Testseite unter <a href=\"https://learn.wu.ac.at/browser-multimedia-test\">https://learn.wu.ac.at/browser-multimedia-test</a> überprüfen.</li></ul> + Ihr Mikrofon ist stummgeschaltet. Bitte heben Sie die Stummschaltung auf. Prüfen Sie das Folgende:<ol> <li>Öffnen Sie die Systemsteuerung Ihres Computers und gehen Sie zu: Sound | Eingabe | Audiogeräte verwalten. 
Prüfen Sie, ob Ihr Mikrofon unter \"Deaktiviert\" angeführt ist. Wenn ja, aktivieren Sie es durch einen Klick auf das Mikrofon.</li></ol> + Index: openacs-4/packages/proctoring-support/catalog/proctoring-support.en_US.ISO-8859-1.xml =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/catalog/Attic/proctoring-support.en_US.ISO-8859-1.xml,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/catalog/proctoring-support.en_US.ISO-8859-1.xml 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,213 @@ + + + + An automated exam-supervision is activated to detect any fraudulent examination performance. You, your screen and your microphone will be recorded. Your data will only be used for the stated purpose, will be stored securely and will not be passed on to third parties. + Your browser does not support audio acquisition. + The examination mode is running. Therefore, do not close your browser window until the exam has finished. + Camera cannot be recorded. The error is: + Camera cannot be recorded because your browser does not support camera recording. Please use the most recent version of Google Chrome. + No camera was recognized. Check the following:<ol><li>Check if the camera is currently running in another application. If so, stop the camera recording in this application.</li> <li>Make sure a camera is installed and activated. Open the device manager. The camera is displayed under Image devices or Camera. If no camera is displayed, connect an external camera to your computer. If a down arrow appears in the device manager, it means the camera device is disabled. Right-click the camera to select Activate. Some cameras appear under Cameras and Integrated camera.</li> <li>Has access to the camera been permitted in the system settings? Open the system settings of your computer and go to \"Camera\". Check whether camera access is generally activated. 
Go to \"Change\" to enable camera access. Check if desktop apps are allowed to access your camera. Set the slider to \"On\".</li> <li>If you are using an external camera, check whether it is connected to the computer and, if it has a switch, is also switched on.</li> <li>Make sure that the correct camera is selected in the browser. In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Camera and check if the correct camera has been selected.</li></ol> + Camera cannot be recorded because it has no permission to do so. Check the following:<ol> <li>Is your camera blocked in the browser settings? In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Camera. If the slider is set to \"Blocked\", set it to \"Ask before access (recommended)\".</li> <li>Have you ever denied access to the camera in the past? In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Camera. Check whether there are links to WU websites under \"Block\". Delete these websites from the list by clicking on the garbage can symbol.</li> <li>Has access to the camera been permitted in the system settings? Open the system settings of your computer and go to \"Camera\". Check whether camera access is generally activated. Go to \"Change\" to enable camera access. Check if desktop apps are allowed to access your camera. Set the slider to \"On\".</li></ol> + Check again + If automated exam-supervision is activated, the system will create recordings of the participants and their screen. + Desktop cannot be recorded. Error is: + Desktop cannot be recorded because your browser does not support screen recording. Please use the most recent version of Google Chrome. + Desktop cannot be recorded because it is denied the permission to do so. 
Check the following:<ol> <li>Make sure you granted your browser permission to record your desktop in the system and browser settings.</li> <li>Check the settings of your antivirus program and deactivate the adblocker in Google Chrome.</li> <li>Delete the cache in the Google Chrome settings.</li></ol> + Desktop cannot be recorded because your browser does not support enforcing full-screen selection. Please use the most recent version of Google Chrome. + <a href="proctoring">List of recordings during an automated exam-supervision</a> + Exam mode + <h4>Examination statement: Important +information for taking online examinations</h4> +<p> + +</p> +<h5>1) Participation</h5> +<p>Without exception, +only students who are officially registered for the course and/or the +examination on LPIS will be allowed to take the exam.</p> +<p> + +</p> +<p>The exam will only +be graded if all 3 of the following conditions have been fulfilled:</p> +<ul> + <li><p>You have + uploaded a photo that fulfills the necessary criteria to confirm + your identity, if such an identification requirement was previously + announced for the examination in question by the person responsible + for the examination.</p> + <li><p>You have + consented to automated online supervision, if previously announced + for this examination.</p> + <li><p>You have + confirmed that you have read and understood this examination + statement.</p> +</ul> +<p> + +</p> +<p>By confirming the +examination statement, you also confirm receipt of the examination. +If you have received the examination but not fulfilled the identity +requirement and/or not confirmed your consent to online supervision +(as applicable), the examination will be declared VOID and will count +as an examination attempt. If you do not confirm the examination +statement, you will not be given access to the exam. This means that +the examination will not be graded and the examination attempt will +not be counted. 
+</p> +<p> + +</p> +<h5>2) Technical requirements</h5> +<p>It is your +responsibility to ensure that you will not be disturbed during the +examination and that all technical requirements (as previously +announced) are fulfilled (see "Online Exam" +and/or "Technical Checklist" in your online exam environment). WU +cannot guarantee a problem-free completion of the examination for +every student working on their individual electronic device. +</p> +<p> + +</p> +<h5>3) Starting and terminating/interrupting the +examination</h5> +<p>By confirming that +you have read and understood this examination statement, you also +confirm receipt of the examination, and the attempt will be counted. +The exam will be graded and counted towards your total number of +permissible examination attempts. This applies even if you terminate +the exam prematurely or do not submit your completed exam. +</p> +<p> + +</p> +<p>If you are forced to +terminate the exam prematurely or interrupt the exam due to technical +difficulties (e.g. loss of your internet connection), please contact +the person responsible for the examination without delay. To do that, +please use the communication channel previously announced for that +purpose by the person responsible for the examination. When reporting +the termination/interruption of your exam, be sure to include the +following information:</p> +<ul> + <li><p>Your student ID + number</p> + <li><p>Exact time of + termination/interruption</p> + <li><p>Screenshot of + the error message, if applicable + </p> +</ul> +<p>If you are able to +solve the problem and continue with the exam, please report this with +the message "Examination resumed." To be able to send such +reports in the event of technical problems, it is recommended that +you download the appropriate apps to your mobile devices before the +start of the examination. In this way, you will for example be able +to report the termination/interruption of the exam even if your Wi-Fi +connection fails. 
+</p> +<p> + +</p> +<p>In cases where you +report the termination/interruption of the exam due to technical +problems beyond your control, as a general rule, the examination is +not graded and the examination attempt not counted. If you would like +to have your exam graded nonetheless, please contact the person +responsible for the examination immediately after the exam and +request that your exam be graded. The examination can only be graded +if all the applicable requirements for grading are fulfilled. Only +those parts of the exam will be graded that were completed without +any technical problems (i.e. under active online exam supervision).</p> +<p> + +</p> +<h5>4) Cheating and identity confirmation</h5> +<p>Any attempt to cheat +on the exam (e.g. cell phone, consulting forbidden materials, +consulting other people) will result in the exam being declared VOID +and the examination attempt counted. You will also be blocked from +re-registering to repeat the examination for a period of 4 months +starting from the examination date. +</p> +<p> + +</p> +<p>Having someone else +take your exam for you can constitute a criminal offense. Such +instances of cheating will be reported to the public prosecutor's +office, without exception.</p> +<p> + +</p> +<p>If announced in +advance by the person responsible for the exam, automated online +examination supervision will be conducted for the duration of the +exam. This means that you and your screen will be monitored by camera +and microphone (video and audio) throughout the exam, and the +examiner will be able to see and hear the recordings. You need to +consent to this online supervision in advance under "Online +Supervision". As soon as you start the exam, you will need to give +your browser permission to access your screen, webcam, and +microphone. 
Any attempts to manipulate the online exam supervision +will be considered attempts at cheating.</p> +<p> + +</p> +<h5>5) Permissible aids</h5> +<p>When taking this +examination, you may use only the aids explicitly listed by the +person responsible for the examination in the exam information under +"Online exam." As a rule, no one else may be present in the same +room with you while you are working on the exam. +</p> +<p> + +</p> +<p><b>Acknowledgment of +the information on receipt and completion of the examination +</b></p> +<p>I confirm that</p> +<ul> + <li>I have read and + understood the information given above on taking this written online + examination + <li> + I fulfill all the requirements for taking this examination + <li>I will complete + the examination independently and without consulting any forbidden + aids + <li>I accept + receipt of this examination, i.e. I acknowledge that the exam will + be graded and will count towards my total number of examination + attempts +</ul> + If your browser prompts you to give access to the camera, please do so. Once permission has been granted, a preview image will appear. Please make sure your face is clearly visible in the picture before proceeding. + Grant access to your camera + You are being prompted to give access to your desktop. Please first select "full screen" and grant the access thereafter by clicking the button "share". On success you should see a preview of your desktop on the page. + Grant access to your desktop + If your browser prompts you to give access to the microphone, please do so. Once permission has been granted, an audiowave will be displayed to test your microphone. Please make sure your microphone is not muted and the audiowave is moving and reacts to your voice. + Grant access to your microphone + Microphone cannot be recorded. Error is: + No microphone was recognized. Check the following:<ol> <li>Make sure a microphone is installed. 
Open the control panel of your computer and click on \"Hardware and Sound\". In the window that opens, navigate to Sound | Input and check that a microphone is listed. Test the microphone and troubleshoot it if necessary.</li> <li>If you are using an external microphone, check whether it is connected to the computer and, if it has a switch, is also switched on.</li> <li>Make sure that the correct microphone is selected in the browser. In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Microphone and check if the correct microphone has been selected.</li></ol> + Access to your microphone has been prevented. Please check if access to the microphone has been permitted in the system settings. Open the system settings of your computer and go to \"Microphone\". Check whether microphone access is generally activated. Go to \"Change\" to enable microphone access. Check if apps and desktop apps are allowed to access your microphone. Set the slider to \"On\". + Your microphone cannot be recorded because it has no permission to do so. Check the following:<ol> <li>Is your microphone blocked in the browser settings? In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Microphone. If the slider is set to \"Blocked\", set it to \"Ask before access (recommended)\".</li> <li>Have you ever denied access to the microphone in the past? In Google Chrome go to: Browser Settings | Privacy and security | Website settings | Permissions | Microphone. Check whether there are links to WU websites under \"Block\". Delete these websites from the list by clicking on the trash symbol.</li> <li>Has access to the microphone been permitted in the system settings? Open the system settings of your computer and go to \"Microphone\". Check whether microphone access is generally activated. Go to \"Change\" to enable microphone access. Check if desktop apps are allowed to access your microphone. 
Set the slider to \"On\".</li></ol> + Your microphone seems to be completely silent. Please raise your recording volume. Check the following:<ol> <li>Open the control panel of your computer and go to: Sound | Advanced sound options. If necessary, adjust the volume here.</li></ol> + Your browser does not support camera, microphone or screen capturing, or you denied access to those devices. To proceed with the exam, please reset the related browser settings, grant access to camera, microphone and full-screen capturing, and ensure that your microphone is activated and not muted. + Mobile devices are unsupported. + Online Exam + If enabled, a preview of recorded input will be displayed to users during the online supervision session. + Automated exam-supervision + Recordings + Request to the server has failed. Retry in 10s... + Request to the server timed out. Retry in 10s... + Photo + Finish + Desktop cannot be recorded because you selected something other than full screen in your desktop selection. Select full screen in your desktop selection. + Important Note:<br><br>LEARN shows that your camera is producing black or otherwise indiscernible pictures at the moment. This could invalidate your exam attempt!<ul><li>Please make sure the camera is working and is not obstructed.</li><li>Please make sure that your device is plugged into a power cord and that you are not running on battery.</li><li>You can test your camera setup at <a href=\"https://learn.wu.ac.at/browser-multimedia-test\">https://learn.wu.ac.at/browser-multimedia-test</a></li></ul> + Important Note:<br><br>LEARN shows that your desktop is producing black or otherwise indiscernible pictures at the moment. 
This could invalidate your exam attempt!<ul><li>Please make sure that your device is plugged into a power cord and that you are not running on battery.</li><li>You can test your camera setup at <a href=\"https://learn.wu.ac.at/browser-multimedia-test\">https://learn.wu.ac.at/browser-multimedia-test</a></li></ul> + Your microphone is muted. Please unmute it. Check the following:<ol> <li>Open the control panel of your computer and go to: Sound | Input | Manage audio devices. Check whether your microphone is listed under \"Deactivated\". If so, activate it by clicking on the microphone.</li></ol> + Index: openacs-4/packages/proctoring-support/lib/examination-statement-accept.tcl =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/lib/Attic/examination-statement-accept.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/lib/examination-statement-accept.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,21 @@ +ad_include_contract { + + Store acceptance of the examination statement + +} { + object_id:naturalnum,notnull +} + +auth::require_login + +set user_id [ad_conn user_id] + +::xo::dc dml -prepare {integer integer} insert { + insert into proctoring_examination_statement_acceptance + (object_id, user_id) + values + (:object_id, :user_id) +} + +ns_return 200 text/plain OK +ad_script_abort Index: openacs-4/packages/proctoring-support/lib/proctored-page.adp =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/lib/Attic/proctored-page.adp,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/lib/proctored-page.adp 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,120 @@ + + 0 + 0 + 0 + 0 + + + + + + + + + +
+

#xowf.menu-New-App-OnlineExam#

+ + + +
+
@msg.exam_mode;literal@
+
+
+ +
+
@msg.proctoring_accept@
+
+
+

#proctoring-support.grant_access_to_microphone_title#

+
#proctoring-support.grant_access_to_microphone_msg#
+ +
+
+

#proctoring-support.grant_access_to_camera_title#

+
#proctoring-support.grant_access_to_camera_msg#
+ +
+
+

#proctoring-support.grant_access_to_desktop_title#

+
#proctoring-support.grant_access_to_desktop_msg#
+ +
+
+
+
+
+ + + +
+
+ +
+ + + + + + + + + +
+
+ +
+
+ @msg.proctoring_banner@ +
+
+ +
+
+ + + Index: openacs-4/packages/proctoring-support/lib/proctored-page.tcl =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/lib/Attic/proctored-page.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/lib/proctored-page.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,98 @@ +ad_include_contract { + Embed proctoring support in a page. + + This template creates an empty page embedding the specified URL in an + iframe surrounded by proctoring support. + + This kind of proctoring will take snapshots from camera and + desktop at random intervals and upload them to the specified URL. + + @param object_id Id of the exam. This can be e.g. the item_id of the + exam object. + @param object_url URL of the actual exam, which will be included in + an iframe. + @param min_ms_interval minimum time to the next snapshot in + milliseconds + @param max_ms_interval maximum time to the next snapshot in + milliseconds. + @param audio_p decides if we record audio. Every time some input + longer than min_audio_duration is detected from the + microphone, a recording will be started and terminated at + the next silence, or once it reaches max_audio_duration + @param min_audio_duration minimum audio duration to start + recording in seconds + @param max_audio_duration max audio duration in seconds. Once + reached, recording will stop and resume at the next + detected audio segment. + @param preview_p if specified, a preview of the recorded inputs will + be displayed to users during the proctored session + @param proctoring_p Do the actual proctoring. Can be disabled to + display just the examination statement + @param examination_statement_p Display the examination statement + @param examination_statement_url URL we are calling in order to + store acceptance of the examination statement. It will + receive 'object_id' as query parameter. 
+ @param upload_url URL for the backend receiving and storing the + collected snapshots. It will receive 'name' (device name, + either camera or desktop), item_id (exam_id), and the + file. The default URL is the one that becomes available + once the proctoring-support package is mounted; it will + store the pictures in the /proctoring folder under + acs_root_dir. + @param msg an array that can be used to customize UI labels with + fields: 'missing_stream' (message to display in case + proctoring fails due to a missing stream), 'accept' (label + for the accept button), 'exam_mode' (examination statement + for the exam, to be displayed as a literal), + 'proctoring_accept' (disclaimer informing users that + proctoring will happen), 'proctoring_banner' (message in + the red proctoring banner). Any of those fields can be + omitted and will default to message keys in this package. +} { + object_url:localurl + object_id:naturalnum,notnull + {min_ms_interval:naturalnum 1000} + {max_ms_interval:naturalnum 60000} + {audio_p:boolean true} + {min_audio_duration:naturalnum 2} + {max_audio_duration:naturalnum 60} + {preview_p:boolean false} + {proctoring_p:boolean true} + {examination_statement_p:boolean true} + {examination_statement_url:localurl "/proctoring/examination-statement-accept"} + {upload_url:localurl "/proctoring/upload"} + msg:array,optional +} + +set default_msg(missing_stream) [_ proctoring-support.missing_stream_message] +set default_msg(proctoring_accept) [_ proctoring-support.accept_message] +set default_msg(exam_mode) [_ proctoring-support.Exam_mode_message] +set default_msg(proctoring_banner) [_ proctoring-support.banner_message] +set default_msg(black_picture_camera) [_ proctoring-support.you_are_producing_black_pictures_from_camera] +set default_msg(black_picture_desktop) [_ proctoring-support.you_are_producing_black_pictures_from_desktop] +set default_msg(request_failed) [_ proctoring-support.request_failed] +set default_msg(request_timeout) [_ 
proctoring-support.request_timeout] +set default_msg(audio_grabbing_not_supported) [_ proctoring-support.audio_grabbing_not_supported] +set default_msg(camera_grabbing_not_supported) [_ proctoring-support.camera_grabbing_not_supported] +set default_msg(desktop_grabbing_not_supported) [_ proctoring-support.desktop_grabbing_not_supported] +set default_msg(your_microphone_is_muted) [_ proctoring-support.your_microphone_is_muted] +set default_msg(camera_permission_denied) [_ proctoring-support.camera_permission_denied] +set default_msg(microphone_permission_denied) [_ proctoring-support.microphone_permission_denied] +set default_msg(desktop_permission_denied) [_ proctoring-support.desktop_permission_denied] +set default_msg(camera_not_found) [_ proctoring-support.camera_not_found] +set default_msg(microphone_not_found) [_ proctoring-support.microphone_not_found] +set default_msg(microphone_not_readable) [_ proctoring-support.microphone_not_readable] +set default_msg(camera_not_readable) [_ proctoring-support.camera_not_readable] +set default_msg(wrong_display_surface_selected) [_ proctoring-support.wrong_display_surface_selected] +set default_msg(display_surface_not_supported) [_ proctoring-support.display_surface_not_supported] +set default_msg(mobile_devices_not_supported) [_ proctoring-support.mobile_devices_are_unsupported] + +foreach {key value} [array get default_msg] { + if {![info exists msg($key)]} { + set msg($key) $value + } +} + +set mobile_p [ad_conn mobile_p] +set preview_p [expr {$preview_p ? true : false}] +set proctoring_p [expr {$proctoring_p ? 
true : false}] Index: openacs-4/packages/proctoring-support/lib/proctoring-upload.tcl =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/lib/Attic/proctoring-upload.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/lib/proctoring-upload.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,70 @@ +ad_include_contract { + + Implements the upload backend for proctoring, which can be used as + is or inside e.g. an object method. + +} { + name:oneof(camera|desktop),notnull + type:oneof(image|audio),notnull + object_id:naturalnum,notnull + file + file.tmpfile + {notify_p:boolean false} +} + +auth::require_login + +set user_id [ad_conn user_id] + +set proctoring_dir [::proctoring::folder \ + -object_id $object_id -user_id $user_id] + +if {$type eq "audio"} { + # set mime_type [exec [util::which file] --mime-type -b ${file.tmpfile}] + set mime_type video/webm +} else { + set mime_type [ns_imgmime ${file.tmpfile}] +} +if {($type eq "image" && ![regexp {^image/(.*)$} $mime_type m extension]) || + ($type eq "audio" && ![regexp {^video/(.*)$} $mime_type m extension]) +} { + ns_log warning "Proctoring: user $user_id uploaded a non-$type ($mime_type) file for object $object_id" + ns_return 500 text/plain "KO" + ad_script_abort +} elseif {![::proctoring::active_p -object_id $object_id]} { + ns_return 200 text/plain "OFF" + ad_script_abort +} + +set timestamp [clock seconds] +set file_path $proctoring_dir/${name}-${type}-$timestamp.$extension +file mkdir -- $proctoring_dir +file rename -force -- ${file.tmpfile} $file_path + +# TODO: downstream UI implements the receiving end of this +# websocket. We should port it. +# Notify a websocket about the upload so that e.g. a UI can be updated +# in real time. 
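The comment above describes the intent of the websocket notification: a monitoring UI, not yet ported, would update in real time as snapshots arrive. As an illustration only, a hypothetical receiving end could decode the multicast JSON payload; the field names (`user_id`, `name`, `type`, `timestamp`, `file`) are the ones this endpoint sends, everything else here is assumption:

```javascript
// Sketch (not part of this changeset): decode a proctoring upload
// notification as built by the Tcl "subst" below.
function parseProctoringEvent(raw) {
  const msg = JSON.parse(raw);
  return {
    userId: Number(msg.user_id),                     // uploading user
    device: msg.name,                                // "camera" or "desktop"
    kind: msg.type,                                  // "image" or "audio"
    when: new Date(Number(msg.timestamp) * 1000),    // epoch seconds -> Date
    file: msg.file                                   // server-side path
  };
}

const sample = JSON.stringify({
  user_id: "42", name: "camera", type: "image",
  timestamp: "1596988800", file: "/var/www/proctoring/123/42/camera-image-1596988800.jpeg"
});
const ev = parseProctoringEvent(sample);
console.log(ev.device, ev.userId); // → camera 42
```

A real client would obtain `raw` from the `message` event of a websocket subscribed to the `proctoring-${object_id}` (or per-user) channel.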
+if {$notify_p} { + set message [subst -nocommands { + { + "user_id": "$user_id", + "name": "$name", + "type": "$type", + "timestamp": "$timestamp", + "file": "$file_path" + } + }] + + set message [::ws::build_msg $message] + + set chat proctoring-${object_id} + #ns_log warning "Sending to chat $chat" + ::ws::multicast $chat $message + + set chat proctoring-${object_id}-${user_id} + #ns_log warning "Sending to chat $chat" + ::ws::multicast $chat $message +} + +ns_return 200 text/plain OK Index: openacs-4/packages/proctoring-support/sql/postgresql/proctoring-support-create.sql =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/sql/postgresql/Attic/proctoring-support-create.sql,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/sql/postgresql/proctoring-support-create.sql 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,51 @@ + +-- Table of objects where proctoring is enabled +create table proctoring_objects ( + object_id integer + primary key + references acs_objects(object_id) on delete cascade, + enabled_p boolean not null default true, + start_date date, -- date since which proctoring can happen + -- (e.g. 2018-01-01) + + end_date date, -- date since which proctoring is closed + -- (e.g. 2018-01-02) + + start_time time, -- time of day since which proctoring can + -- happen for every day where proctoring + -- is enabled (e.g. 08:00) + + end_time time, -- time of day since which proctoring is + -- closed for every day where proctoring + -- is enabled (e.g. 
20:00) + preview_p boolean not null default false, -- display a preview of recording to proctored user + proctoring_p boolean not null default true, -- do the actual proctoring + examination_statement_p boolean not null default true -- display the examination statement +); + +comment on table proctoring_objects is 'Objects for which proctoring is enabled'; +comment on column proctoring_objects.object_id is 'Object which should be proctored'; +comment on column proctoring_objects.start_date is 'Date since which proctoring can happen'; +comment on column proctoring_objects.end_date is 'Date since which proctoring is closed'; +comment on column proctoring_objects.start_time is 'Time of day since which proctoring can happen for every day where proctoring is enabled'; +comment on column proctoring_objects.end_time is 'Time of day since which proctoring is closed for every day where proctoring is enabled'; +comment on column proctoring_objects.preview_p is 'Display a preview of the recording to the proctored user'; +comment on column proctoring_objects.proctoring_p is 'Turn proctoring on/off'; +comment on column proctoring_objects.examination_statement_p is 'Display the examination statement'; + + +create table proctoring_examination_statement_acceptance ( + object_id integer not null + references acs_objects(object_id) on delete cascade, + user_id integer not null + references users(user_id) on delete cascade, + timestamp timestamp not null default current_timestamp +); + +comment on table proctoring_examination_statement_acceptance is 'Records acceptance of the examination statements for a proctored object by a user. 
Can be repeated.'; + +create index proctoring_examination_statement_acceptance_object_id_idx on + proctoring_examination_statement_acceptance(object_id); + +create index proctoring_examination_statement_acceptance_user_id_idx on + proctoring_examination_statement_acceptance(user_id); Index: openacs-4/packages/proctoring-support/sql/postgresql/proctoring-support-drop.sql =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/sql/postgresql/Attic/proctoring-support-drop.sql,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/sql/postgresql/proctoring-support-drop.sql 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,3 @@ + +drop table proctoring_objects; +drop table proctoring_examination_statement_acceptance; Index: openacs-4/packages/proctoring-support/tcl/proctoring-procs.tcl =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/tcl/Attic/proctoring-procs.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/tcl/proctoring-procs.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,141 @@ +ad_library { + Proctoring API +} + +namespace eval ::proctoring {} + +ad_proc ::proctoring::folder { + -object_id:required + -user_id +} { + Returns the proctoring folder on the system +} { + set folder [acs_root_dir]/proctoring/$object_id + if {[info exists user_id]} { + append folder /$user_id + } + return $folder +} + +ad_proc ::proctoring::configure { + -object_id:required + {-enabled_p true} + {-examination_statement_p true} + {-proctoring_p true} + {-preview_p false} + {-start_date ""} + {-end_date ""} + {-start_time ""} + {-end_time ""} +} { + Configures proctoring for specified object. + + @param enabled_p enable proctoring. + @param proctoring_p Do the actual proctoring. 
This makes it possible to show + only the examination statement, without + actually taking and uploading pictures/sound. + @param examination_statement_p Display the examination statement + @param preview_p if specified, a preview of the recorded inputs will + be displayed to users during the proctored session + @param start_date Date since which proctoring is enabled. No start + date check is performed when not specified and + proctoring will be enabled from today. + @param end_date Date since which proctoring will not count as + enabled anymore. No end date check is performed + when not specified and proctoring will not expire. + @param start_time Time of day since when proctoring is + executed. No time check when not specified. + @param end_time Time of day since when proctoring is not + executed. No time check when not specified. +} { + ::xo::dc dml insert_proctoring { + insert into proctoring_objects ( + object_id, + enabled_p, + start_date, + end_date, + start_time, + end_time, + preview_p, + proctoring_p, + examination_statement_p + ) values ( + :object_id, + :enabled_p, + :start_date, + :end_date, + :start_time, + :end_time, + :preview_p, + :proctoring_p, + :examination_statement_p + ) on conflict(object_id) do update set + enabled_p = :enabled_p, + start_date = :start_date, + end_date = :end_date, + start_time = :start_time, + end_time = :end_time, + preview_p = :preview_p, + proctoring_p = :proctoring_p, + examination_statement_p = :examination_statement_p + } +} + +ad_proc ::proctoring::get_configuration { + -object_id:required +} { + Returns proctoring settings for the specified object + + @return a dict with fields: enabled_p, start_date, end_date, + start_time, end_time, preview_p, proctoring_p, + examination_statement_p +} { + set start_date "" + set end_date "" + set start_time "" + set end_time "" + set enabled_p false + set preview_p true + set proctoring_p false + set examination_statement_p false + ::xo::dc 0or1row is_proctored { + select to_char(start_date, 
'YYYY-MM-DD') as start_date, + to_char(end_date, 'YYYY-MM-DD') as end_date, + to_char(start_time, 'HH24:MI:SS') as start_time, + to_char(end_time, 'HH24:MI:SS') as end_time, + case when preview_p then 'true' else 'false' end as preview_p, + case when proctoring_p then 'true' else 'false' end as proctoring_p, + case when enabled_p then 'true' else 'false' end as enabled_p, + case when examination_statement_p then 'true' else 'false' end as examination_statement_p + from proctoring_objects + where object_id = :object_id + } + + return [list \ + enabled_p $enabled_p \ + start_date $start_date \ + end_date $end_date \ + start_time $start_time \ + end_time $end_time \ + preview_p $preview_p \ + proctoring_p $proctoring_p \ + examination_statement_p $examination_statement_p] +} + +ad_proc ::proctoring::active_p { + -object_id:required +} { + Returns whether proctoring is active now. + + @return boolean +} { + return [::xo::dc 0or1row -prepare integer check { + select 1 from proctoring_objects + where object_id = :object_id + and enabled_p + and (start_date is null or start_date <= current_date) + and (end_date is null or end_date >= current_date) + and (start_time is null or start_time <= cast(current_timestamp as time)) + and (end_time is null or end_time >= cast(current_timestamp as time)) + }] +} Index: openacs-4/packages/proctoring-support/www/examination-statement-accept.adp =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/Attic/examination-statement-accept.adp,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/examination-statement-accept.adp 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,3 @@ + \ No newline at end of file Index: openacs-4/packages/proctoring-support/www/examination-statement-accept.tcl =================================================================== RCS file: 
/usr/local/cvsroot/openacs-4/packages/proctoring-support/www/Attic/examination-statement-accept.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/examination-statement-accept.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,5 @@ +ad_page_contract { + Accept the examination statement +} { + object_id:naturalnum,notnull +} Index: openacs-4/packages/proctoring-support/www/upload.adp =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/Attic/upload.adp,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/upload.adp 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,7 @@ + \ No newline at end of file Index: openacs-4/packages/proctoring-support/www/upload.tcl =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/Attic/upload.tcl,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/upload.tcl 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,9 @@ +ad_page_contract { + Proctoring upload endpoint +} { + name:oneof(camera|desktop),notnull + type:oneof(image|audio),notnull + object_id:naturalnum,notnull + file + file.tmpfile +} Index: openacs-4/packages/proctoring-support/www/resources/audiowave.js =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/audiowave.js,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/audiowave.js 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,53 @@ +// A class implementing an audiowave that can be plugged to an audio +// MediaStream object and displayed as a canvas on the page. 
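The header comment above introduces AudioWave; the heart of its `draw()` loop is the mapping from unsigned 8-bit time-domain samples (where 128 means silence) to canvas y-coordinates. That arithmetic can be checked in isolation; this standalone sketch is for illustration only and is not part of audiowave.js:

```javascript
// Sketch (not part of audiowave.js): the sample-to-y mapping used in
// AudioWave.draw(). Samples are unsigned bytes (0..255); 128 represents
// silence and lands on the vertical middle of the canvas.
function sampleToY(sample, canvasHeight) {
  const v = sample / 128.0;        // ~0..2, exactly 1.0 at silence
  return v * canvasHeight / 2;     // canvasHeight / 2 at silence
}

console.log(sampleToY(128, 200)); // → 100 (vertical center)
```

In the class below the same formula appears inline (`v = dataArray[i] / 128.0; y = v * canvas.height / 2`), applied to every bin returned by `getByteTimeDomainData`.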
+ +class AudioWave { + constructor( + stream, + canvasSelector = "canvas", + fillStyle = 'rgb(255, 255, 255)', + strokeStyle = 'rgb(255, 0, 0)', + lineWidth = 2 + ) { + this.audioCtx = new (window.AudioContext || window.webkitAudioContext)(); + this.analyser = this.audioCtx.createAnalyser(); + this.source = this.audioCtx.createMediaStreamSource(stream); + this.source.connect(this.analyser); + this.analyser.fftSize = 2048; + this.bufferLength = this.analyser.frequencyBinCount; + this.dataArray = new Uint8Array(this.bufferLength); + this.canvas = document.querySelector(canvasSelector); + this.canvasCtx = this.canvas.getContext("2d"); + this.fillStyle = fillStyle; + this.strokeStyle = strokeStyle; + this.lineWidth = lineWidth; + + this.draw(); + } + + draw() { + var drawVisual = requestAnimationFrame(this.draw.bind(this)); + this.analyser.getByteTimeDomainData(this.dataArray); + this.canvasCtx.fillStyle = this.fillStyle; + this.canvasCtx.fillRect(0, 0, this.canvas.width, this.canvas.height); + this.canvasCtx.lineWidth = this.lineWidth; + this.canvasCtx.strokeStyle = this.strokeStyle; + this.canvasCtx.beginPath(); + var sliceWidth = this.canvas.width * 1.0 / this.bufferLength; + var x = 0; + for(var i = 0; i < this.bufferLength; i++) { + var v = this.dataArray[i] / 128.0; + var y = v * this.canvas.height / 2; + + if(i === 0) { + this.canvasCtx.moveTo(x, y); + } else { + this.canvasCtx.lineTo(x, y); + } + + x += sliceWidth; + } + this.canvasCtx.lineTo(this.canvas.width, this.canvas.height/2); + this.canvasCtx.stroke(); + } +} Index: openacs-4/packages/proctoring-support/www/resources/gif.js =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/gif.js,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/gif.js 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,3 @@ +// gif.js 0.2.0 - https://github.com/jnordberg/gif.js 
+(function(f){if(typeof exports==="object"&&typeof module!=="undefined"){module.exports=f()}else if(typeof define==="function"&&define.amd){define([],f)}else{var g;if(typeof window!=="undefined"){g=window}else if(typeof global!=="undefined"){g=global}else if(typeof self!=="undefined"){g=self}else{g=this}g.GIF=f()}})(function(){var define,module,exports;return function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o0&&this._events[type].length>m){this._events[type].warned=true;console.error("(node) warning: possible EventEmitter memory "+"leak detected. %d listeners added. "+"Use emitter.setMaxListeners() to increase limit.",this._events[type].length);if(typeof console.trace==="function"){console.trace()}}}return this};EventEmitter.prototype.on=EventEmitter.prototype.addListener;EventEmitter.prototype.once=function(type,listener){if(!isFunction(listener))throw TypeError("listener must be a function");var fired=false;function g(){this.removeListener(type,g);if(!fired){fired=true;listener.apply(this,arguments)}}g.listener=listener;this.on(type,g);return this};EventEmitter.prototype.removeListener=function(type,listener){var list,position,length,i;if(!isFunction(listener))throw TypeError("listener must be a function");if(!this._events||!this._events[type])return this;list=this._events[type];length=list.length;position=-1;if(list===listener||isFunction(list.listener)&&list.listener===listener){delete this._events[type];if(this._events.removeListener)this.emit("removeListener",type,listener)}else if(isObject(list)){for(i=length;i-- 
>0;){if(list[i]===listener||list[i].listener&&list[i].listener===listener){position=i;break}}if(position<0)return this;if(list.length===1){list.length=0;delete this._events[type]}else{list.splice(position,1)}if(this._events.removeListener)this.emit("removeListener",type,listener)}return this};EventEmitter.prototype.removeAllListeners=function(type){var key,listeners;if(!this._events)return this;if(!this._events.removeListener){if(arguments.length===0)this._events={};else if(this._events[type])delete this._events[type];return this}if(arguments.length===0){for(key in this._events){if(key==="removeListener")continue;this.removeAllListeners(key)}this.removeAllListeners("removeListener");this._events={};return this}listeners=this._events[type];if(isFunction(listeners)){this.removeListener(type,listeners)}else if(listeners){while(listeners.length)this.removeListener(type,listeners[listeners.length-1])}delete this._events[type];return this};EventEmitter.prototype.listeners=function(type){var ret;if(!this._events||!this._events[type])ret=[];else if(isFunction(this._events[type]))ret=[this._events[type]];else ret=this._events[type].slice();return ret};EventEmitter.prototype.listenerCount=function(type){if(this._events){var evlistener=this._events[type];if(isFunction(evlistener))return 1;else if(evlistener)return evlistener.length}return 0};EventEmitter.listenerCount=function(emitter,type){return emitter.listenerCount(type)};function isFunction(arg){return typeof arg==="function"}function isNumber(arg){return typeof arg==="number"}function isObject(arg){return typeof arg==="object"&&arg!==null}function isUndefined(arg){return arg===void 0}},{}],2:[function(require,module,exports){var 
UA,browser,mode,platform,ua;ua=navigator.userAgent.toLowerCase();platform=navigator.platform.toLowerCase();UA=ua.match(/(opera|ie|firefox|chrome|version)[\s\/:]([\w\d\.]+)?.*?(safari|version[\s\/:]([\w\d\.]+)|$)/)||[null,"unknown",0];mode=UA[1]==="ie"&&document.documentMode;browser={name:UA[1]==="version"?UA[3]:UA[1],version:mode||parseFloat(UA[1]==="opera"&&UA[4]?UA[4]:UA[2]),platform:{name:ua.match(/ip(?:ad|od|hone)/)?"ios":(ua.match(/(?:webos|android)/)||platform.match(/mac|win|linux/)||["other"])[0]}};browser[browser.name]=true;browser[browser.name+parseInt(browser.version,10)]=true;browser.platform[browser.platform.name]=true;module.exports=browser},{}],3:[function(require,module,exports){var EventEmitter,GIF,browser,extend=function(child,parent){for(var key in parent){if(hasProp.call(parent,key))child[key]=parent[key]}function ctor(){this.constructor=child}ctor.prototype=parent.prototype;child.prototype=new ctor;child.__super__=parent.prototype;return child},hasProp={}.hasOwnProperty,indexOf=[].indexOf||function(item){for(var i=0,l=this.length;iref;i=0<=ref?++j:--j){results.push(null)}return results}.call(this);numWorkers=this.spawnWorkers();if(this.options.globalPalette===true){this.renderNextFrame()}else{for(i=j=0,ref=numWorkers;0<=ref?jref;i=0<=ref?++j:--j){this.renderNextFrame()}}this.emit("start");return this.emit("progress",0)};GIF.prototype.abort=function(){var worker;while(true){worker=this.activeWorkers.shift();if(worker==null){break}this.log("killing active worker");worker.terminate()}this.running=false;return this.emit("abort")};GIF.prototype.spawnWorkers=function(){var j,numWorkers,ref,results;numWorkers=Math.min(this.options.workers,this.frames.length);(function(){results=[];for(var j=ref=this.freeWorkers.length;ref<=numWorkers?jnumWorkers;ref<=numWorkers?j++:j--){results.push(j)}return results}).apply(this).forEach(function(_this){return function(i){var worker;_this.log("spawning worker "+i);worker=new 
Worker(_this.options.workerScript);worker.onmessage=function(event){_this.activeWorkers.splice(_this.activeWorkers.indexOf(worker),1);_this.freeWorkers.push(worker);return _this.frameFinished(event.data)};return _this.freeWorkers.push(worker)}}(this));return numWorkers};GIF.prototype.frameFinished=function(frame){var i,j,ref;this.log("frame "+frame.index+" finished - "+this.activeWorkers.length+" active");this.finishedFrames++;this.emit("progress",this.finishedFrames/this.frames.length);this.imageParts[frame.index]=frame;if(this.options.globalPalette===true){this.options.globalPalette=frame.globalPalette;this.log("global palette analyzed");if(this.frames.length>2){for(i=j=1,ref=this.freeWorkers.length;1<=ref?jref;i=1<=ref?++j:--j){this.renderNextFrame()}}}if(indexOf.call(this.imageParts,null)>=0){return this.renderNextFrame()}else{return this.finishRendering()}};GIF.prototype.finishRendering=function(){var data,frame,i,image,j,k,l,len,len1,len2,len3,offset,page,ref,ref1,ref2;len=0;ref=this.imageParts;for(j=0,len1=ref.length;j=this.frames.length){return}frame=this.frames[this.nextFrame++];worker=this.freeWorkers.shift();task=this.getTask(frame);this.log("starting frame "+(task.index+1)+" of "+this.frames.length);this.activeWorkers.push(worker);return worker.postMessage(task)};GIF.prototype.getContextData=function(ctx){return ctx.getImageData(0,0,this.options.width,this.options.height).data};GIF.prototype.getImageData=function(image){var ctx;if(this._canvas==null){this._canvas=document.createElement("canvas");this._canvas.width=this.options.width;this._canvas.height=this.options.height}ctx=this._canvas.getContext("2d");ctx.setFill=this.options.background;ctx.fillRect(0,0,this.options.width,this.options.height);ctx.drawImage(image,0,0);return this.getContextData(ctx)};GIF.prototype.getTask=function(frame){var 
index,task;index=this.frames.indexOf(frame);task={index:index,last:index===this.frames.length-1,delay:frame.delay,transparent:frame.transparent,width:this.options.width,height:this.options.height,quality:this.options.quality,dither:this.options.dither,globalPalette:this.options.globalPalette,repeat:this.options.repeat,canTransfer:browser.name==="chrome"};if(frame.data!=null){task.data=frame.data}else if(frame.context!=null){task.data=this.getContextData(frame.context)}else if(frame.image!=null){task.data=this.getImageData(frame.image)}else{throw new Error("Invalid frame")}return task};GIF.prototype.log=function(){var args;args=1<=arguments.length?slice.call(arguments,0):[];if(!this.options.debug){return}return console.log.apply(console,args)};return GIF}(EventEmitter);module.exports=GIF},{"./browser.coffee":2,events:1}]},{},[3])(3)}); +//# sourceMappingURL=gif.js.map Index: openacs-4/packages/proctoring-support/www/resources/gif.js.map =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/gif.js.map,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/gif.js.map 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1 @@ 
+{"version":3,"sources":["node_modules/browser-pack/_prelude.js","node_modules/events/events.js","src/browser.coffee","src/gif.coffee"],"names":["f","exports","module","define","amd","g","window","global","self","this","GIF","e","t","n","r","s","o","u","a","require","i","Error","code","l","call","length","1","EventEmitter","_events","_maxListeners","undefined","prototype","defaultMaxListeners","setMaxListeners","isNumber","isNaN","TypeError","emit","type","er","handler","len","args","listeners","error","isObject","arguments","err","context","isUndefined","isFunction","Array","slice","apply","addListener","listener","m","newListener","push","warned","console","trace","on","once","fired","removeListener","list","position","splice","removeAllListeners","key","ret","listenerCount","evlistener","emitter","arg","UA","browser","mode","platform","ua","navigator","userAgent","toLowerCase","match","document","documentMode","name","version","parseFloat","parseInt","extend","child","parent","hasProp","ctor","constructor","__super__","superClass","defaults","frameDefaults","workerScript","workers","repeat","background","quality","width","height","transparent","debug","dither","delay","copy","options","base","value","running","frames","freeWorkers","activeWorkers","setOptions","setOption","_canvas","results","addFrame","image","frame","ImageData","data","CanvasRenderingContext2D","WebGLRenderingContext","getContextData","childNodes","getImageData","render","j","numWorkers","ref","nextFrame","finishedFrames","imageParts","spawnWorkers","globalPalette","renderNextFrame","abort","worker","shift","log","terminate","Math","min","forEach","_this","Worker","onmessage","event","indexOf","frameFinished","index","finishRendering","k","len1","len2","len3","offset","page","ref1","ref2","pageSize","cursor","round","Uint8Array","set","Blob","task","getTask","postMessage","ctx","createElement","getContext","setFill","fillRect","drawImage","last","canTransfer"],"mappings":";CAAA,SAAAA,GAAA,SAAAC
,WAAA,gBAAAC,UAAA,YAAA,CAAAA,OAAAD,QAAAD,QAAA,UAAAG,UAAA,YAAAA,OAAAC,IAAA,CAAAD,UAAAH,OAAA,CAAA,GAAAK,EAAA,UAAAC,UAAA,YAAA,CAAAD,EAAAC,WAAA,UAAAC,UAAA,YAAA,CAAAF,EAAAE,WAAA,UAAAC,QAAA,YAAA,CAAAH,EAAAG,SAAA,CAAAH,EAAAI,KAAAJ,EAAAK,IAAAV,OAAA,WAAA,GAAAG,QAAAD,OAAAD,OAAA,OAAA,SAAAU,GAAAC,EAAAC,EAAAC,GAAA,QAAAC,GAAAC,EAAAC,GAAA,IAAAJ,EAAAG,GAAA,CAAA,IAAAJ,EAAAI,GAAA,CAAA,GAAAE,SAAAC,UAAA,YAAAA,OAAA,KAAAF,GAAAC,EAAA,MAAAA,GAAAF,GAAA,EAAA,IAAAI,EAAA,MAAAA,GAAAJ,GAAA,EAAA,IAAAhB,GAAA,GAAAqB,OAAA,uBAAAL,EAAA,IAAA,MAAAhB,GAAAsB,KAAA,mBAAAtB,EAAA,GAAAuB,GAAAV,EAAAG,IAAAf,WAAAW,GAAAI,GAAA,GAAAQ,KAAAD,EAAAtB,QAAA,SAAAU,GAAA,GAAAE,GAAAD,EAAAI,GAAA,GAAAL,EAAA,OAAAI,GAAAF,EAAAA,EAAAF,IAAAY,EAAAA,EAAAtB,QAAAU,EAAAC,EAAAC,EAAAC,GAAA,MAAAD,GAAAG,GAAAf,QAAA,GAAAmB,SAAAD,UAAA,YAAAA,OAAA,KAAA,GAAAH,GAAA,EAAAA,EAAAF,EAAAW,OAAAT,IAAAD,EAAAD,EAAAE,GAAA,OAAAD,KAAAW,GAAA,SAAAP,QAAAjB,OAAAD,SCqBA,QAAA0B,gBACAlB,KAAAmB,QAAAnB,KAAAmB,WACAnB,MAAAoB,cAAApB,KAAAoB,eAAAC,UAEA5B,OAAAD,QAAA0B,YAGAA,cAAAA,aAAAA,YAEAA,cAAAI,UAAAH,QAAAE,SACAH,cAAAI,UAAAF,cAAAC,SAIAH,cAAAK,oBAAA,EAIAL,cAAAI,UAAAE,gBAAA,SAAApB,GACA,IAAAqB,SAAArB,IAAAA,EAAA,GAAAsB,MAAAtB,GACA,KAAAuB,WAAA,8BACA3B,MAAAoB,cAAAhB,CACA,OAAAJ,MAGAkB,cAAAI,UAAAM,KAAA,SAAAC,MACA,GAAAC,IAAAC,QAAAC,IAAAC,KAAAtB,EAAAuB,SAEA,KAAAlC,KAAAmB,QACAnB,KAAAmB,UAGA,IAAAU,OAAA,QAAA,CACA,IAAA7B,KAAAmB,QAAAgB,OACAC,SAAApC,KAAAmB,QAAAgB,SAAAnC,KAAAmB,QAAAgB,MAAAnB,OAAA,CACAc,GAAAO,UAAA,EACA,IAAAP,aAAAlB,OAAA,CACA,KAAAkB,QACA,CAEA,GAAAQ,KAAA,GAAA1B,OAAA,yCAAAkB,GAAA,IACAQ,KAAAC,QAAAT,EACA,MAAAQ,OAKAP,QAAA/B,KAAAmB,QAAAU,KAEA,IAAAW,YAAAT,SACA,MAAA,MAEA,IAAAU,WAAAV,SAAA,CACA,OAAAM,UAAArB,QAEA,IAAA,GACAe,QAAAhB,KAAAf,KACA,MACA,KAAA,GACA+B,QAAAhB,KAAAf,KAAAqC,UAAA,GACA,MACA,KAAA,GACAN,QAAAhB,KAAAf,KAAAqC,UAAA,GAAAA,UAAA,GACA,MAEA,SACAJ,KAAAS,MAAApB,UAAAqB,MAAA5B,KAAAsB,UAAA,EACAN,SAAAa,MAAA5C,KAAAiC,WAEA,IAAAG,SAAAL,SAAA,CACAE,KAAAS,MAAApB,UAAAqB,MAAA5B,KAAAsB,UAAA,EACAH,WAAAH,QAAAY,OACAX,KAAAE,UAAAlB,MACA,KAAAL,EAAA,EAAAA,EAAAqB,IAAArB,IACAuB,UAAAvB,GAAAiC,MAAA5C,KAA
AiC,MAGA,MAAA,MAGAf,cAAAI,UAAAuB,YAAA,SAAAhB,KAAAiB,UACA,GAAAC,EAEA,KAAAN,WAAAK,UACA,KAAAnB,WAAA,8BAEA,KAAA3B,KAAAmB,QACAnB,KAAAmB,UAIA,IAAAnB,KAAAmB,QAAA6B,YACAhD,KAAA4B,KAAA,cAAAC,KACAY,WAAAK,SAAAA,UACAA,SAAAA,SAAAA,SAEA,KAAA9C,KAAAmB,QAAAU,MAEA7B,KAAAmB,QAAAU,MAAAiB,aACA,IAAAV,SAAApC,KAAAmB,QAAAU,OAEA7B,KAAAmB,QAAAU,MAAAoB,KAAAH,cAGA9C,MAAAmB,QAAAU,OAAA7B,KAAAmB,QAAAU,MAAAiB,SAGA,IAAAV,SAAApC,KAAAmB,QAAAU,SAAA7B,KAAAmB,QAAAU,MAAAqB,OAAA,CACA,IAAAV,YAAAxC,KAAAoB,eAAA,CACA2B,EAAA/C,KAAAoB,kBACA,CACA2B,EAAA7B,aAAAK,oBAGA,GAAAwB,GAAAA,EAAA,GAAA/C,KAAAmB,QAAAU,MAAAb,OAAA+B,EAAA,CACA/C,KAAAmB,QAAAU,MAAAqB,OAAA,IACAC,SAAAhB,MAAA,gDACA,sCACA,mDACAnC,KAAAmB,QAAAU,MAAAb,OACA,UAAAmC,SAAAC,QAAA,WAAA,CAEAD,QAAAC,UAKA,MAAApD,MAGAkB,cAAAI,UAAA+B,GAAAnC,aAAAI,UAAAuB,WAEA3B,cAAAI,UAAAgC,KAAA,SAAAzB,KAAAiB,UACA,IAAAL,WAAAK,UACA,KAAAnB,WAAA,8BAEA,IAAA4B,OAAA,KAEA,SAAA3D,KACAI,KAAAwD,eAAA3B,KAAAjC,EAEA,KAAA2D,MAAA,CACAA,MAAA,IACAT,UAAAF,MAAA5C,KAAAqC,YAIAzC,EAAAkD,SAAAA,QACA9C,MAAAqD,GAAAxB,KAAAjC,EAEA,OAAAI,MAIAkB,cAAAI,UAAAkC,eAAA,SAAA3B,KAAAiB,UACA,GAAAW,MAAAC,SAAA1C,OAAAL,CAEA,KAAA8B,WAAAK,UACA,KAAAnB,WAAA,8BAEA,KAAA3B,KAAAmB,UAAAnB,KAAAmB,QAAAU,MACA,MAAA7B,KAEAyD,MAAAzD,KAAAmB,QAAAU,KACAb,QAAAyC,KAAAzC,MACA0C,WAAA,CAEA,IAAAD,OAAAX,UACAL,WAAAgB,KAAAX,WAAAW,KAAAX,WAAAA,SAAA,OACA9C,MAAAmB,QAAAU,KACA,IAAA7B,KAAAmB,QAAAqC,eACAxD,KAAA4B,KAAA,iBAAAC,KAAAiB,cAEA,IAAAV,SAAAqB,MAAA,CACA,IAAA9C,EAAAK,OAAAL,KAAA,GAAA,CACA,GAAA8C,KAAA9C,KAAAmC,UACAW,KAAA9C,GAAAmC,UAAAW,KAAA9C,GAAAmC,WAAAA,SAAA,CACAY,SAAA/C,CACA,QAIA,GAAA+C,SAAA,EACA,MAAA1D,KAEA,IAAAyD,KAAAzC,SAAA,EAAA,CACAyC,KAAAzC,OAAA,QACAhB,MAAAmB,QAAAU,UACA,CACA4B,KAAAE,OAAAD,SAAA,GAGA,GAAA1D,KAAAmB,QAAAqC,eACAxD,KAAA4B,KAAA,iBAAAC,KAAAiB,UAGA,MAAA9C,MAGAkB,cAAAI,UAAAsC,mBAAA,SAAA/B,MACA,GAAAgC,KAAA3B,SAEA,KAAAlC,KAAAmB,QACA,MAAAnB,KAGA,KAAAA,KAAAmB,QAAAqC,eAAA,CACA,GAAAnB,UAAArB,SAAA,EACAhB,KAAAmB,eACA,IAAAnB,KAAAmB,QAAAU,YACA7B,MAAAmB,QAAAU,KACA,OAAA7B,MAIA,GAAAqC,UAAArB,SAAA,EAAA,CACA,IAAA6C,MAAA7D,MAAAmB,QAAA,CACA,GAAA0C,MAAA,iBAAA,
QACA7D,MAAA4D,mBAAAC,KAEA7D,KAAA4D,mBAAA,iBACA5D,MAAAmB,UACA,OAAAnB,MAGAkC,UAAAlC,KAAAmB,QAAAU,KAEA,IAAAY,WAAAP,WAAA,CACAlC,KAAAwD,eAAA3B,KAAAK,eACA,IAAAA,UAAA,CAEA,MAAAA,UAAAlB,OACAhB,KAAAwD,eAAA3B,KAAAK,UAAAA,UAAAlB,OAAA,UAEAhB,MAAAmB,QAAAU,KAEA,OAAA7B,MAGAkB,cAAAI,UAAAY,UAAA,SAAAL,MACA,GAAAiC,IACA,KAAA9D,KAAAmB,UAAAnB,KAAAmB,QAAAU,MACAiC,WACA,IAAArB,WAAAzC,KAAAmB,QAAAU,OACAiC,KAAA9D,KAAAmB,QAAAU,WAEAiC,KAAA9D,KAAAmB,QAAAU,MAAAc,OACA,OAAAmB,KAGA5C,cAAAI,UAAAyC,cAAA,SAAAlC,MACA,GAAA7B,KAAAmB,QAAA,CACA,GAAA6C,YAAAhE,KAAAmB,QAAAU,KAEA,IAAAY,WAAAuB,YACA,MAAA,OACA,IAAAA,WACA,MAAAA,YAAAhD,OAEA,MAAA,GAGAE,cAAA6C,cAAA,SAAAE,QAAApC,MACA,MAAAoC,SAAAF,cAAAlC,MAGA,SAAAY,YAAAyB,KACA,aAAAA,OAAA,WAGA,QAAAzC,UAAAyC,KACA,aAAAA,OAAA,SAGA,QAAA9B,UAAA8B,KACA,aAAAA,OAAA,UAAAA,MAAA,KAGA,QAAA1B,aAAA0B,KACA,MAAAA,WAAA,6CC5SA,GAAAC,IAAAC,QAAAC,KAAAC,SAAAC,EAEAA,IAAKC,UAAUC,UAAUC,aACzBJ,UAAWE,UAAUF,SAASI,aAC9BP,IAAKI,GAAGI,MAAM,iGAAmG,KAAM,UAAW,EAClIN,MAAOF,GAAG,KAAM,MAAQS,SAASC,YAEjCT,UACEU,KAASX,GAAG,KAAM,UAAeA,GAAG,GAAQA,GAAG,GAC/CY,QAASV,MAAQW,WAAcb,GAAG,KAAM,SAAWA,GAAG,GAAQA,GAAG,GAAQA,GAAG,IAE5EG,UACEQ,KAASP,GAAGI,MAAM,oBAAyB,OAAYJ,GAAGI,MAAM,sBAAwBL,SAASK,MAAM,mBAAqB,UAAU,IAE1IP,SAAQA,QAAQU,MAAQ,IACxBV,SAAQA,QAAQU,KAAOG,SAASb,QAAQW,QAAS,KAAO,IACxDX,SAAQE,SAASF,QAAQE,SAASQ,MAAQ,IAE1CrF,QAAOD,QAAU4E,iDClBjB,GAAAlD,cAAAjB,IAAAmE,QAAAc,OAAA,SAAAC,MAAAC,QAAA,IAAA,GAAAvB,OAAAuB,QAAA,CAAA,GAAAC,QAAAtE,KAAAqE,OAAAvB,KAAAsB,MAAAtB,KAAAuB,OAAAvB,KAAA,QAAAyB,QAAAtF,KAAAuF,YAAAJ,MAAAG,KAAAhE,UAAA8D,OAAA9D,SAAA6D,OAAA7D,UAAA,GAAAgE,KAAAH,OAAAK,UAAAJ,OAAA9D,SAAA,OAAA6D,sKAACjE,cAAgBR,QAAQ,UAARQ,YACjBkD,SAAU1D,QAAQ,mBAEZT,KAAA,SAAAwF,YAEJ,GAAAC,UAAAC,oCAAAD,WACEE,aAAc,gBACdC,QAAS,EACTC,OAAQ,EACRC,WAAY,OACZC,QAAS,GACTC,MAAO,KACPC,OAAQ,KACRC,YAAa,KACbC,MAAO,MACPC,OAAQ,MAEVV,gBACEW,MAAO,IACPC,KAAM,MAEK,SAAAtG,KAACuG,SACZ,GAAAC,MAAA5C,IAAA6C,KAAA1G,MAAC2G,QAAU,KAEX3G,MAACwG,UACDxG,MAAC4G,SAED5G,MAAC6G,cACD7G,MAAC8G,gBAED9G,MAAC+G,WAAWP,QACZ,KAAA3C,MAAA6B,UAAA,6DACW7B,KAAQ6C,sBAErBM,UAAW,SAACnD,IAAK6C,
OACf1G,KAACwG,QAAQ3C,KAAO6C,KAChB,IAAG1G,KAAAiH,SAAA,OAAcpD,MAAQ,SAARA,MAAiB,UAAlC,OACE7D,MAACiH,QAAQpD,KAAO6C,sBAEpBK,WAAY,SAACP,SACX,GAAA3C,KAAAqD,QAAAR,KAAAQ,gBAAArD,MAAA2C,SAAA,wEAAAxG,KAACgH,UAAUnD,IAAK6C,sCAElBS,SAAU,SAACC,MAAOZ,SAChB,GAAAa,OAAAxD,sBADgB2C,WAChBa,QACAA,OAAMlB,YAAcnG,KAACwG,QAAQL,WAC7B,KAAAtC,MAAA8B,eAAA,CACE0B,MAAMxD,KAAO2C,QAAQ3C,MAAQ8B,cAAc9B,KAG7C,GAAuC7D,KAAAwG,QAAAP,OAAA,KAAvC,CAAAjG,KAACgH,UAAU,QAASI,MAAMnB,OAC1B,GAAyCjG,KAAAwG,QAAAN,QAAA,KAAzC,CAAAlG,KAACgH,UAAU,SAAUI,MAAMlB,QAE3B,SAAGoB,aAAA,aAAAA,YAAA,MAAeF,gBAAiBE,WAAnC,CACGD,MAAME,KAAOH,MAAMG,SACjB,UAAIC,4BAAA,aAAAA,2BAAA,MAA8BJ,gBAAiBI,iCAA8BC,yBAAA,aAAAA,wBAAA,MAA2BL,gBAAiBK,uBAA7H,CACH,GAAGjB,QAAQD,KAAX,CACEc,MAAME,KAAOvH,KAAC0H,eAAeN,WAD/B,CAGEC,MAAM9E,QAAU6E,WACf,IAAGA,MAAAO,YAAA,KAAH,CACH,GAAGnB,QAAQD,KAAX,CACEc,MAAME,KAAOvH,KAAC4H,aAAaR,WAD7B,CAGEC,MAAMD,MAAQA,WAJb,CAMH,KAAU,IAAAxG,OAAM,uBAElBZ,MAAC4G,OAAO3D,KAAKoE,sBAEfQ,OAAQ,WACN,GAAAlH,GAAAmH,EAAAC,WAAAC,GAAA,IAAqChI,KAAC2G,QAAtC,CAAA,KAAU,IAAA/F,OAAM,mBAEhB,GAAOZ,KAAAwG,QAAAP,OAAA,MAAuBjG,KAAAwG,QAAAN,QAAA,KAA9B,CACE,KAAU,IAAAtF,OAAM,mDAElBZ,KAAC2G,QAAU,IACX3G,MAACiI,UAAY,CACbjI,MAACkI,eAAiB,CAElBlI,MAACmI,WAAD,4BAAejB,gBAAcvG,EAAAmH,EAAA,EAAAE,IAAAhI,KAAA4G,OAAA5F,OAAA,GAAAgH,IAAAF,EAAAE,IAAAF,EAAAE,IAAArH,EAAA,GAAAqH,MAAAF,IAAAA,EAAd,cAAA,gCACfC,YAAa/H,KAACoI,cAEd,IAAGpI,KAACwG,QAAQ6B,gBAAiB,KAA7B,CACErI,KAACsI,sBADH,CAGE,IAA4B3H,EAAAmH,EAAA,EAAAE,IAAAD,WAAA,GAAAC,IAAAF,EAAAE,IAAAF,EAAAE,IAAArH,EAAA,GAAAqH,MAAAF,IAAAA,EAA5B,CAAA9H,KAACsI,mBAEHtI,KAAC4B,KAAK,eACN5B,MAAC4B,KAAK,WAAY,kBAEpB2G,MAAO,WACL,GAAAC,OAAA,OAAA,KAAA,CACEA,OAASxI,KAAC8G,cAAc2B,OACxB,IAAaD,QAAA,KAAb,CAAA,MACAxI,KAAC0I,IAAI,wBACLF,QAAOG,YACT3I,KAAC2G,QAAU,YACX3G,MAAC4B,KAAK,wBAIRwG,aAAc,WACZ,GAAAN,GAAAC,WAAAC,IAAAd,OAAAa,YAAaa,KAAKC,IAAI7I,KAACwG,QAAQX,QAAS7F,KAAC4G,OAAO5F,SAChD,4KAAmC8H,QAAQ,SAAAC,aAAA,UAACpI,GAC1C,GAAA6H,OAAAO,OAACL,IAAI,mBAAoB/H,EACzB6H,QAAa,GAAAQ,QAAOD,MAACvC,QAAQZ,aAC7B4C,QAAOS,UAAY,SAACC,OAClBH,MAACjC,cAAcnD,OAAOoF,MAACjC,cAAcqC,
QAAQX,QAAS,EACtDO,OAAClC,YAAY5D,KAAKuF,cAClBO,OAACK,cAAcF,MAAM3B,aACvBwB,OAAClC,YAAY5D,KAAKuF,UAPuBxI,MAQ3C,OAAO+H,2BAETqB,cAAe,SAAC/B,OACd,GAAA1G,GAAAmH,EAAAE,GAAAhI,MAAC0I,IAAI,SAAUrB,MAAMgC,MAAO,eAAerJ,KAAC8G,cAAc9F,OAAQ,UAClEhB,MAACkI,gBACDlI,MAAC4B,KAAK,WAAY5B,KAACkI,eAAiBlI,KAAC4G,OAAO5F,OAC5ChB,MAACmI,WAAWd,MAAMgC,OAAShC,KAE3B,IAAGrH,KAACwG,QAAQ6B,gBAAiB,KAA7B,CACErI,KAACwG,QAAQ6B,cAAgBhB,MAAMgB,aAC/BrI,MAAC0I,IAAI,0BACL,IAAyD1I,KAAC4G,OAAO5F,OAAS,EAA1E,CAAA,IAA4BL,EAAAmH,EAAA,EAAAE,IAAAhI,KAAA6G,YAAA7F,OAAA,GAAAgH,IAAAF,EAAAE,IAAAF,EAAAE,IAAArH,EAAA,GAAAqH,MAAAF,IAAAA,EAA5B,CAAA9H,KAACsI,oBACH,GAAGa,QAAApI,KAAQf,KAACmI,WAAT,OAAA,EAAH,OACEnI,MAACsI,sBADH,OAGEtI,MAACsJ,kCAELA,gBAAiB,WACf,GAAA/B,MAAAF,MAAA1G,EAAAyG,MAAAU,EAAAyB,EAAAzI,EAAAkB,IAAAwH,KAAAC,KAAAC,KAAAC,OAAAC,KAAA5B,IAAA6B,KAAAC,IAAA9H,KAAM,CACNgG,KAAAhI,KAAAmI,UAAA,KAAAL,EAAA,EAAA0B,KAAAxB,IAAAhH,OAAA8G,EAAA0B,KAAA1B,IAAA,aACE9F,OAAQqF,MAAME,KAAKvG,OAAS,GAAKqG,MAAM0C,SAAW1C,MAAM2C,OAC1DhI,KAAOqF,MAAM0C,SAAW1C,MAAM2C,MAC9BhK,MAAC0I,IAAI,iCAAkCE,KAAKqB,MAAMjI,IAAM,KAAO,KAC/DuF,MAAW,GAAA2C,YAAWlI,IACtB2H,QAAS,CACTE,MAAA7J,KAAAmI,UAAA,KAAAoB,EAAA,EAAAE,KAAAI,KAAA7I,OAAAuI,EAAAE,KAAAF,IAAA,cACEO,MAAAzC,MAAAE,IAAA,KAAA5G,EAAAG,EAAA,EAAA4I,KAAAI,KAAA9I,OAAAF,EAAA4I,KAAA/I,IAAAG,EAAA,aACEyG,MAAK4C,IAAIP,KAAMD,OACf,IAAGhJ,IAAK0G,MAAME,KAAKvG,OAAS,EAA5B,CACE2I,QAAUtC,MAAM2C,WADlB,CAGEL,QAAUtC,MAAM0C,WAEtB3C,MAAY,GAAAgD,OAAM7C,OAChB1F,KAAM,oBAER7B,MAAC4B,KAAK,WAAYwF,MAAOG,qBAE3Be,gBAAiB,WACf,GAAAjB,OAAAgD,KAAA7B,MAAA,IAAqCxI,KAAC6G,YAAY7F,SAAU,EAA5D,CAAA,KAAU,IAAAJ,OAAM,mBAChB,GAAUZ,KAACiI,WAAajI,KAAC4G,OAAO5F,OAAhC,CAAA,OAEAqG,MAAQrH,KAAC4G,OAAO5G,KAACiI,YACjBO,QAASxI,KAAC6G,YAAY4B,OACtB4B,MAAOrK,KAACsK,QAAQjD,MAEhBrH,MAAC0I,IAAI,mBAAmB2B,KAAKhB,MAAQ,GAAG,OAAOrJ,KAAC4G,OAAO5F,OACvDhB,MAAC8G,cAAc7D,KAAKuF,cACpBA,QAAO+B,YAAYF,qBAErB3C,eAAgB,SAAC8C,KACf,MAAOA,KAAI5C,aAAa,EAAG,EAAG5H,KAACwG,QAAQP,MAAOjG,KAACwG,QAAQN,QAAQqB,oBAEjEK,aAAc,SAACR,OACb,GAAAoD,IAAA,IAAOxK,KAAAiH,SAAA,KAAP,CACEjH,KAACiH,QAAUrC,SAAS6F,c
AAc,SAClCzK,MAACiH,QAAQhB,MAAQjG,KAACwG,QAAQP,KAC1BjG,MAACiH,QAAQf,OAASlG,KAACwG,QAAQN,OAE7BsE,IAAMxK,KAACiH,QAAQyD,WAAW,KAC1BF,KAAIG,QAAU3K,KAACwG,QAAQT,UACvByE,KAAII,SAAS,EAAG,EAAG5K,KAACwG,QAAQP,MAAOjG,KAACwG,QAAQN,OAC5CsE,KAAIK,UAAUzD,MAAO,EAAG,EAExB,OAAOpH,MAAC0H,eAAe8C,oBAEzBF,QAAS,SAACjD,OACR,GAAAgC,OAAAgB,IAAAhB,OAAQrJ,KAAC4G,OAAOuC,QAAQ9B,MACxBgD,OACEhB,MAAOA,MACPyB,KAAMzB,QAAUrJ,KAAC4G,OAAO5F,OAAS,EACjCsF,MAAOe,MAAMf,MACbH,YAAakB,MAAMlB,YACnBF,MAAOjG,KAACwG,QAAQP,MAChBC,OAAQlG,KAACwG,QAAQN,OACjBF,QAAShG,KAACwG,QAAQR,QAClBK,OAAQrG,KAACwG,QAAQH,OACjBgC,cAAerI,KAACwG,QAAQ6B,cACxBvC,OAAQ9F,KAACwG,QAAQV,OACjBiF,YAAc3G,QAAQU,OAAQ,SAEhC,IAAGuC,MAAAE,MAAA,KAAH,CACE8C,KAAK9C,KAAOF,MAAME,SACf,IAAGF,MAAA9E,SAAA,KAAH,CACH8H,KAAK9C,KAAOvH,KAAC0H,eAAeL,MAAM9E,aAC/B,IAAG8E,MAAAD,OAAA,KAAH,CACHiD,KAAK9C,KAAOvH,KAAC4H,aAAaP,MAAMD,WAD7B,CAGH,KAAU,IAAAxG,OAAM,iBAElB,MAAOyJ,qBAET3B,IAAK,WACH,GAAAzG,KADIA,MAAA,GAAAI,UAAArB,OAAA2B,MAAA5B,KAAAsB,UAAA,KACJ,KAAcrC,KAACwG,QAAQJ,MAAvB,CAAA,aACAjD,SAAQuF,IAAR9F,MAAAO,QAAYlB,mBA1MEf,aA6MlBzB,QAAOD,QAAUS","sourceRoot":"","sourcesContent":["(function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require==\"function\"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error(\"Cannot find module '\"+o+\"'\");throw f.code=\"MODULE_NOT_FOUND\",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require==\"function\"&&require;for(var o=0;o 0 && this._events[type].length > m) {\n this._events[type].warned = true;\n console.error('(node) warning: possible EventEmitter memory ' +\n 'leak detected. %d listeners added. 
' +\n 'Use emitter.setMaxListeners() to increase limit.',\n this._events[type].length);\n if (typeof console.trace === 'function') {\n // not supported in IE 10\n console.trace();\n }\n }\n }\n\n return this;\n};\n\nEventEmitter.prototype.on = EventEmitter.prototype.addListener;\n\nEventEmitter.prototype.once = function(type, listener) {\n if (!isFunction(listener))\n throw TypeError('listener must be a function');\n\n var fired = false;\n\n function g() {\n this.removeListener(type, g);\n\n if (!fired) {\n fired = true;\n listener.apply(this, arguments);\n }\n }\n\n g.listener = listener;\n this.on(type, g);\n\n return this;\n};\n\n// emits a 'removeListener' event iff the listener was removed\nEventEmitter.prototype.removeListener = function(type, listener) {\n var list, position, length, i;\n\n if (!isFunction(listener))\n throw TypeError('listener must be a function');\n\n if (!this._events || !this._events[type])\n return this;\n\n list = this._events[type];\n length = list.length;\n position = -1;\n\n if (list === listener ||\n (isFunction(list.listener) && list.listener === listener)) {\n delete this._events[type];\n if (this._events.removeListener)\n this.emit('removeListener', type, listener);\n\n } else if (isObject(list)) {\n for (i = length; i-- > 0;) {\n if (list[i] === listener ||\n (list[i].listener && list[i].listener === listener)) {\n position = i;\n break;\n }\n }\n\n if (position < 0)\n return this;\n\n if (list.length === 1) {\n list.length = 0;\n delete this._events[type];\n } else {\n list.splice(position, 1);\n }\n\n if (this._events.removeListener)\n this.emit('removeListener', type, listener);\n }\n\n return this;\n};\n\nEventEmitter.prototype.removeAllListeners = function(type) {\n var key, listeners;\n\n if (!this._events)\n return this;\n\n // not listening for removeListener, no need to emit\n if (!this._events.removeListener) {\n if (arguments.length === 0)\n this._events = {};\n else if (this._events[type])\n delete 
this._events[type];\n return this;\n }\n\n // emit removeListener for all listeners on all events\n if (arguments.length === 0) {\n for (key in this._events) {\n if (key === 'removeListener') continue;\n this.removeAllListeners(key);\n }\n this.removeAllListeners('removeListener');\n this._events = {};\n return this;\n }\n\n listeners = this._events[type];\n\n if (isFunction(listeners)) {\n this.removeListener(type, listeners);\n } else if (listeners) {\n // LIFO order\n while (listeners.length)\n this.removeListener(type, listeners[listeners.length - 1]);\n }\n delete this._events[type];\n\n return this;\n};\n\nEventEmitter.prototype.listeners = function(type) {\n var ret;\n if (!this._events || !this._events[type])\n ret = [];\n else if (isFunction(this._events[type]))\n ret = [this._events[type]];\n else\n ret = this._events[type].slice();\n return ret;\n};\n\nEventEmitter.prototype.listenerCount = function(type) {\n if (this._events) {\n var evlistener = this._events[type];\n\n if (isFunction(evlistener))\n return 1;\n else if (evlistener)\n return evlistener.length;\n }\n return 0;\n};\n\nEventEmitter.listenerCount = function(emitter, type) {\n return emitter.listenerCount(type);\n};\n\nfunction isFunction(arg) {\n return typeof arg === 'function';\n}\n\nfunction isNumber(arg) {\n return typeof arg === 'number';\n}\n\nfunction isObject(arg) {\n return typeof arg === 'object' && arg !== null;\n}\n\nfunction isUndefined(arg) {\n return arg === void 0;\n}\n","### CoffeeScript version of the browser detection from MooTools ###\n\nua = navigator.userAgent.toLowerCase()\nplatform = navigator.platform.toLowerCase()\nUA = ua.match(/(opera|ie|firefox|chrome|version)[\\s\\/:]([\\w\\d\\.]+)?.*?(safari|version[\\s\\/:]([\\w\\d\\.]+)|$)/) or [null, 'unknown', 0]\nmode = UA[1] == 'ie' && document.documentMode\n\nbrowser =\n name: if UA[1] is 'version' then UA[3] else UA[1]\n version: mode or parseFloat(if UA[1] is 'opera' && UA[4] then UA[4] else UA[2])\n\n platform:\n 
name: if ua.match(/ip(?:ad|od|hone)/) then 'ios' else (ua.match(/(?:webos|android)/) or platform.match(/mac|win|linux/) or ['other'])[0]\n\nbrowser[browser.name] = true\nbrowser[browser.name + parseInt(browser.version, 10)] = true\nbrowser.platform[browser.platform.name] = true\n\nmodule.exports = browser\n","{EventEmitter} = require 'events'\nbrowser = require './browser.coffee'\n\nclass GIF extends EventEmitter\n\n defaults =\n workerScript: 'gif.worker.js'\n workers: 2\n repeat: 0 # repeat forever, -1 = repeat once\n background: '#fff'\n quality: 10 # pixel sample interval, lower is better\n width: null # size derermined from first frame if possible\n height: null\n transparent: null\n debug: false\n dither: false # see GIFEncoder.js for dithering options\n\n frameDefaults =\n delay: 500 # ms\n copy: false\n\n constructor: (options) ->\n @running = false\n\n @options = {}\n @frames = []\n\n @freeWorkers = []\n @activeWorkers = []\n\n @setOptions options\n for key, value of defaults\n @options[key] ?= value\n\n setOption: (key, value) ->\n @options[key] = value\n if @_canvas? and key in ['width', 'height']\n @_canvas[key] = value\n\n setOptions: (options) ->\n @setOption key, value for own key, value of options\n\n addFrame: (image, options={}) ->\n frame = {}\n frame.transparent = @options.transparent\n for key of frameDefaults\n frame[key] = options[key] or frameDefaults[key]\n\n # use the images width and height for options unless already set\n @setOption 'width', image.width unless @options.width?\n @setOption 'height', image.height unless @options.height?\n\n if ImageData? and image instanceof ImageData\n frame.data = image.data\n else if (CanvasRenderingContext2D? and image instanceof CanvasRenderingContext2D) or (WebGLRenderingContext? 
and image instanceof WebGLRenderingContext)\n if options.copy\n frame.data = @getContextData image\n else\n frame.context = image\n else if image.childNodes?\n if options.copy\n frame.data = @getImageData image\n else\n frame.image = image\n else\n throw new Error 'Invalid image'\n\n @frames.push frame\n\n render: ->\n throw new Error 'Already running' if @running\n\n if not @options.width? or not @options.height?\n throw new Error 'Width and height must be set prior to rendering'\n\n @running = true\n @nextFrame = 0\n @finishedFrames = 0\n\n @imageParts = (null for i in [0...@frames.length])\n numWorkers = @spawnWorkers()\n # we need to wait for the palette\n if @options.globalPalette == true\n @renderNextFrame()\n else\n @renderNextFrame() for i in [0...numWorkers]\n\n @emit 'start'\n @emit 'progress', 0\n\n abort: ->\n loop\n worker = @activeWorkers.shift()\n break unless worker?\n @log 'killing active worker'\n worker.terminate()\n @running = false\n @emit 'abort'\n\n # private\n\n spawnWorkers: ->\n numWorkers = Math.min(@options.workers, @frames.length)\n [@freeWorkers.length...numWorkers].forEach (i) =>\n @log \"spawning worker #{ i }\"\n worker = new Worker @options.workerScript\n worker.onmessage = (event) =>\n @activeWorkers.splice @activeWorkers.indexOf(worker), 1\n @freeWorkers.push worker\n @frameFinished event.data\n @freeWorkers.push worker\n return numWorkers\n\n frameFinished: (frame) ->\n @log \"frame #{ frame.index } finished - #{ @activeWorkers.length } active\"\n @finishedFrames++\n @emit 'progress', @finishedFrames / @frames.length\n @imageParts[frame.index] = frame\n # remember calculated palette, spawn the rest of the workers\n if @options.globalPalette == true\n @options.globalPalette = frame.globalPalette\n @log 'global palette analyzed'\n @renderNextFrame() for i in [1...@freeWorkers.length] if @frames.length > 2\n if null in @imageParts\n @renderNextFrame()\n else\n @finishRendering()\n\n finishRendering: ->\n len = 0\n for frame in 
@imageParts\n len += (frame.data.length - 1) * frame.pageSize + frame.cursor\n len += frame.pageSize - frame.cursor\n @log \"rendering finished - filesize #{ Math.round(len / 1000) }kb\"\n data = new Uint8Array len\n offset = 0\n for frame in @imageParts\n for page, i in frame.data\n data.set page, offset\n if i is frame.data.length - 1\n offset += frame.cursor\n else\n offset += frame.pageSize\n\n image = new Blob [data],\n type: 'image/gif'\n\n @emit 'finished', image, data\n\n renderNextFrame: ->\n throw new Error 'No free workers' if @freeWorkers.length is 0\n return if @nextFrame >= @frames.length # no new frame to render\n\n frame = @frames[@nextFrame++]\n worker = @freeWorkers.shift()\n task = @getTask frame\n\n @log \"starting frame #{ task.index + 1 } of #{ @frames.length }\"\n @activeWorkers.push worker\n worker.postMessage task#, [task.data.buffer]\n\n getContextData: (ctx) ->\n return ctx.getImageData(0, 0, @options.width, @options.height).data\n\n getImageData: (image) ->\n if not @_canvas?\n @_canvas = document.createElement 'canvas'\n @_canvas.width = @options.width\n @_canvas.height = @options.height\n\n ctx = @_canvas.getContext '2d'\n ctx.setFill = @options.background\n ctx.fillRect 0, 0, @options.width, @options.height\n ctx.drawImage image, 0, 0\n\n return @getContextData ctx\n\n getTask: (frame) ->\n index = @frames.indexOf frame\n task =\n index: index\n last: index is (@frames.length - 1)\n delay: frame.delay\n transparent: frame.transparent\n width: @options.width\n height: @options.height\n quality: @options.quality\n dither: @options.dither\n globalPalette: @options.globalPalette\n repeat: @options.repeat\n canTransfer: (browser.name is 'chrome')\n\n if frame.data?\n task.data = frame.data\n else if frame.context?\n task.data = @getContextData frame.context\n else if frame.image?\n task.data = @getImageData frame.image\n else\n throw new Error 'Invalid frame'\n\n return task\n\n log: (args...) 
->\n return unless @options.debug\n console.log args...\n\n\nmodule.exports = GIF\n"]} \ No newline at end of file Index: openacs-4/packages/proctoring-support/www/resources/gif.worker.js =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/gif.worker.js,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/gif.worker.js 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,3 @@ +// gif.worker.js 0.2.0 - https://github.com/jnordberg/gif.js +(function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o=ByteArray.pageSize)this.newPage();this.pages[this.page][this.cursor++]=val};ByteArray.prototype.writeUTFBytes=function(string){for(var 
l=string.length,i=0;i=0)this.dispose=disposalCode};GIFEncoder.prototype.setRepeat=function(repeat){this.repeat=repeat};GIFEncoder.prototype.setTransparent=function(color){this.transparent=color};GIFEncoder.prototype.addFrame=function(imageData){this.image=imageData;this.colorTab=this.globalPalette&&this.globalPalette.slice?this.globalPalette:null;this.getImagePixels();this.analyzePixels();if(this.globalPalette===true)this.globalPalette=this.colorTab;if(this.firstFrame){this.writeLSD();this.writePalette();if(this.repeat>=0){this.writeNetscapeExt()}}this.writeGraphicCtrlExt();this.writeImageDesc();if(!this.firstFrame&&!this.globalPalette)this.writePalette();this.writePixels();this.firstFrame=false};GIFEncoder.prototype.finish=function(){this.out.writeByte(59)};GIFEncoder.prototype.setQuality=function(quality){if(quality<1)quality=1;this.sample=quality};GIFEncoder.prototype.setDither=function(dither){if(dither===true)dither="FloydSteinberg";this.dither=dither};GIFEncoder.prototype.setGlobalPalette=function(palette){this.globalPalette=palette};GIFEncoder.prototype.getGlobalPalette=function(){return this.globalPalette&&this.globalPalette.slice&&this.globalPalette.slice(0)||this.globalPalette};GIFEncoder.prototype.writeHeader=function(){this.out.writeUTFBytes("GIF89a")};GIFEncoder.prototype.analyzePixels=function(){if(!this.colorTab){this.neuQuant=new NeuQuant(this.pixels,this.sample);this.neuQuant.buildColormap();this.colorTab=this.neuQuant.getColormap()}if(this.dither){this.ditherPixels(this.dither.replace("-serpentine",""),this.dither.match(/-serpentine/)!==null)}else{this.indexPixels()}this.pixels=null;this.colorDepth=8;this.palSize=7;if(this.transparent!==null){this.transIndex=this.findClosest(this.transparent,true)}};GIFEncoder.prototype.indexPixels=function(imgq){var nPix=this.pixels.length/3;this.indexedPixels=new Uint8Array(nPix);var k=0;for(var 
j=0;j=0&&x1+x=0&&y1+y>16,(c&65280)>>8,c&255,used)};GIFEncoder.prototype.findClosestRGB=function(r,g,b,used){if(this.colorTab===null)return-1;if(this.neuQuant&&!used){return this.neuQuant.lookupRGB(r,g,b)}var c=b|g<<8|r<<16;var minpos=0;var dmin=256*256*256;var len=this.colorTab.length;for(var i=0,index=0;i=0){disp=dispose&7}disp<<=2;this.out.writeByte(0|disp|0|transp);this.writeShort(this.delay);this.out.writeByte(this.transIndex);this.out.writeByte(0)};GIFEncoder.prototype.writeImageDesc=function(){this.out.writeByte(44);this.writeShort(0);this.writeShort(0);this.writeShort(this.width);this.writeShort(this.height);if(this.firstFrame||this.globalPalette){this.out.writeByte(0)}else{this.out.writeByte(128|0|0|0|this.palSize)}};GIFEncoder.prototype.writeLSD=function(){this.writeShort(this.width);this.writeShort(this.height);this.out.writeByte(128|112|0|this.palSize);this.out.writeByte(0);this.out.writeByte(0)};GIFEncoder.prototype.writeNetscapeExt=function(){this.out.writeByte(33);this.out.writeByte(255);this.out.writeByte(11);this.out.writeUTFBytes("NETSCAPE2.0");this.out.writeByte(3);this.out.writeByte(1);this.writeShort(this.repeat);this.out.writeByte(0)};GIFEncoder.prototype.writePalette=function(){this.out.writeBytes(this.colorTab);var n=3*256-this.colorTab.length;for(var i=0;i>8&255)};GIFEncoder.prototype.writePixels=function(){var enc=new LZWEncoder(this.width,this.height,this.indexedPixels,this.colorDepth);enc.encode(this.out)};GIFEncoder.prototype.stream=function(){return this.out};module.exports=GIFEncoder},{"./LZWEncoder.js":2,"./TypedNeuQuant.js":3}],2:[function(require,module,exports){var EOF=-1;var BITS=12;var HSIZE=5003;var masks=[0,1,3,7,15,31,63,127,255,511,1023,2047,4095,8191,16383,32767,65535];function LZWEncoder(width,height,pixels,colorDepth){var initCodeSize=Math.max(2,colorDepth);var accum=new Uint8Array(256);var htab=new Int32Array(HSIZE);var codetab=new Int32Array(HSIZE);var cur_accum,cur_bits=0;var a_count;var free_ent=0;var maxcode;var 
clear_flg=false;var g_init_bits,ClearCode,EOFCode;function char_out(c,outs){accum[a_count++]=c;if(a_count>=254)flush_char(outs)}function cl_block(outs){cl_hash(HSIZE);free_ent=ClearCode+2;clear_flg=true;output(ClearCode,outs)}function cl_hash(hsize){for(var i=0;i=0){disp=hsize_reg-i;if(i===0)disp=1;do{if((i-=disp)<0)i+=hsize_reg;if(htab[i]===fcode){ent=codetab[i];continue outer_loop}}while(htab[i]>=0)}output(ent,outs);ent=c;if(free_ent<1<0){outs.writeByte(a_count);outs.writeBytes(accum,0,a_count);a_count=0}}function MAXCODE(n_bits){return(1<0)cur_accum|=code<=8){char_out(cur_accum&255,outs);cur_accum>>=8;cur_bits-=8}if(free_ent>maxcode||clear_flg){if(clear_flg){maxcode=MAXCODE(n_bits=g_init_bits);clear_flg=false}else{++n_bits;if(n_bits==BITS)maxcode=1<0){char_out(cur_accum&255,outs);cur_accum>>=8;cur_bits-=8}flush_char(outs)}}this.encode=encode}module.exports=LZWEncoder},{}],3:[function(require,module,exports){var ncycles=100;var netsize=256;var maxnetpos=netsize-1;var netbiasshift=4;var intbiasshift=16;var intbias=1<>betashift;var betagamma=intbias<>3;var radiusbiasshift=6;var radiusbias=1<>3);var i,v;for(i=0;i>=netbiasshift;network[i][1]>>=netbiasshift;network[i][2]>>=netbiasshift;network[i][3]=i}}function altersingle(alpha,i,b,g,r){network[i][0]-=alpha*(network[i][0]-b)/initalpha;network[i][1]-=alpha*(network[i][1]-g)/initalpha;network[i][2]-=alpha*(network[i][2]-r)/initalpha}function alterneigh(radius,i,b,g,r){var lo=Math.abs(i-radius);var hi=Math.min(i+radius,netsize);var j=i+1;var k=i-1;var m=1;var p,a;while(jlo){a=radpower[m++];if(jlo){p=network[k--];p[0]-=a*(p[0]-b)/alpharadbias;p[1]-=a*(p[1]-g)/alpharadbias;p[2]-=a*(p[2]-r)/alpharadbias}}}function contest(b,g,r){var bestd=~(1<<31);var bestbiasd=bestd;var bestpos=-1;var bestbiaspos=bestpos;var i,n,dist,biasdist,betafreq;for(i=0;i>intbiasshift-netbiasshift);if(biasdist>betashift;freq[i]-=betafreq;bias[i]+=betafreq<>1;for(j=previouscol+1;j>1;for(j=previouscol+1;j<256;j++)netindex[j]=maxnetpos}function 
inxsearch(b,g,r){var a,p,dist;var bestd=1e3;var best=-1;var i=netindex[g];var j=i-1;while(i=0){if(i=bestd)i=netsize;else{i++;if(dist<0)dist=-dist;a=p[0]-b;if(a<0)a=-a;dist+=a;if(dist=0){p=network[j];dist=g-p[1];if(dist>=bestd)j=-1;else{j--;if(dist<0)dist=-dist;a=p[0]-b;if(a<0)a=-a;dist+=a;if(dist>radiusbiasshift;if(rad<=1)rad=0;for(i=0;i=lengthcount)pix-=lengthcount;i++;if(delta===0)delta=1;if(i%delta===0){alpha-=alpha/alphadec;radius-=radius/radiusdec;rad=radius>>radiusbiasshift;if(rad<=1)rad=0;for(j=0;j= ByteArray.pageSize) this.newPage();\n this.pages[this.page][this.cursor++] = val;\n};\n\nByteArray.prototype.writeUTFBytes = function(string) {\n for (var l = string.length, i = 0; i < l; i++)\n this.writeByte(string.charCodeAt(i));\n};\n\nByteArray.prototype.writeBytes = function(array, offset, length) {\n for (var l = length || array.length, i = offset || 0; i < l; i++)\n this.writeByte(array[i]);\n};\n\nfunction GIFEncoder(width, height) {\n // image size\n this.width = ~~width;\n this.height = ~~height;\n\n // transparent color if given\n this.transparent = null;\n\n // transparent index in color table\n this.transIndex = 0;\n\n // -1 = no repeat, 0 = forever. 
anything else is repeat count\n this.repeat = -1;\n\n // frame delay (hundredths)\n this.delay = 0;\n\n this.image = null; // current frame\n this.pixels = null; // BGR byte array from frame\n this.indexedPixels = null; // converted frame indexed to palette\n this.colorDepth = null; // number of bit planes\n this.colorTab = null; // RGB palette\n this.neuQuant = null; // NeuQuant instance that was used to generate this.colorTab.\n this.usedEntry = new Array(); // active palette entries\n this.palSize = 7; // color table size (bits-1)\n this.dispose = -1; // disposal code (-1 = use default)\n this.firstFrame = true;\n this.sample = 10; // default sample interval for quantizer\n this.dither = false; // default dithering\n this.globalPalette = false;\n\n this.out = new ByteArray();\n}\n\n/*\n Sets the delay time between each frame, or changes it for subsequent frames\n (applies to last frame added)\n*/\nGIFEncoder.prototype.setDelay = function(milliseconds) {\n this.delay = Math.round(milliseconds / 10);\n};\n\n/*\n Sets frame rate in frames per second.\n*/\nGIFEncoder.prototype.setFrameRate = function(fps) {\n this.delay = Math.round(100 / fps);\n};\n\n/*\n Sets the GIF frame disposal code for the last added frame and any\n subsequent frames.\n\n Default is 0 if no transparent color has been set, otherwise 2.\n*/\nGIFEncoder.prototype.setDispose = function(disposalCode) {\n if (disposalCode >= 0) this.dispose = disposalCode;\n};\n\n/*\n Sets the number of times the set of GIF frames should be played.\n\n -1 = play once\n 0 = repeat indefinitely\n\n Default is -1\n\n Must be invoked before the first image is added\n*/\n\nGIFEncoder.prototype.setRepeat = function(repeat) {\n this.repeat = repeat;\n};\n\n/*\n Sets the transparent color for the last added frame and any subsequent\n frames. 
Since all colors are subject to modification in the quantization\n process, the color in the final palette for each frame closest to the given\n color becomes the transparent color for that frame. May be set to null to\n indicate no transparent color.\n*/\nGIFEncoder.prototype.setTransparent = function(color) {\n this.transparent = color;\n};\n\n/*\n Adds next GIF frame. The frame is not written immediately, but is\n actually deferred until the next frame is received so that timing\n data can be inserted. Invoking finish() flushes all frames.\n*/\nGIFEncoder.prototype.addFrame = function(imageData) {\n this.image = imageData;\n\n this.colorTab = this.globalPalette && this.globalPalette.slice ? this.globalPalette : null;\n\n this.getImagePixels(); // convert to correct format if necessary\n this.analyzePixels(); // build color table & map pixels\n\n if (this.globalPalette === true) this.globalPalette = this.colorTab;\n\n if (this.firstFrame) {\n this.writeLSD(); // logical screen descriptior\n this.writePalette(); // global color table\n if (this.repeat >= 0) {\n // use NS app extension to indicate reps\n this.writeNetscapeExt();\n }\n }\n\n this.writeGraphicCtrlExt(); // write graphic control extension\n this.writeImageDesc(); // image descriptor\n if (!this.firstFrame && !this.globalPalette) this.writePalette(); // local color table\n this.writePixels(); // encode and write pixel data\n\n this.firstFrame = false;\n};\n\n/*\n Adds final trailer to the GIF stream, if you don't call the finish method\n the GIF stream will not be valid.\n*/\nGIFEncoder.prototype.finish = function() {\n this.out.writeByte(0x3b); // gif trailer\n};\n\n/*\n Sets quality of color quantization (conversion of images to the maximum 256\n colors allowed by the GIF specification). Lower values (minimum = 1)\n produce better colors, but slow processing significantly. 10 is the\n default, and produces good color mapping at reasonable speeds. 
Values\n greater than 20 do not yield significant improvements in speed.\n*/\nGIFEncoder.prototype.setQuality = function(quality) {\n if (quality < 1) quality = 1;\n this.sample = quality;\n};\n\n/*\n Sets dithering method. Available are:\n - FALSE no dithering\n - TRUE or FloydSteinberg\n - FalseFloydSteinberg\n - Stucki\n - Atkinson\n You can add '-serpentine' to use serpentine scanning\n*/\nGIFEncoder.prototype.setDither = function(dither) {\n if (dither === true) dither = 'FloydSteinberg';\n this.dither = dither;\n};\n\n/*\n Sets global palette for all frames.\n You can provide TRUE to create global palette from first picture.\n Or an array of r,g,b,r,g,b,...\n*/\nGIFEncoder.prototype.setGlobalPalette = function(palette) {\n this.globalPalette = palette;\n};\n\n/*\n Returns global palette used for all frames.\n If setGlobalPalette(true) was used, then this function will return\n calculated palette after the first frame is added.\n*/\nGIFEncoder.prototype.getGlobalPalette = function() {\n return (this.globalPalette && this.globalPalette.slice && this.globalPalette.slice(0)) || this.globalPalette;\n};\n\n/*\n Writes GIF file header\n*/\nGIFEncoder.prototype.writeHeader = function() {\n this.out.writeUTFBytes(\"GIF89a\");\n};\n\n/*\n Analyzes current frame colors and creates color map.\n*/\nGIFEncoder.prototype.analyzePixels = function() {\n if (!this.colorTab) {\n this.neuQuant = new NeuQuant(this.pixels, this.sample);\n this.neuQuant.buildColormap(); // create reduced palette\n this.colorTab = this.neuQuant.getColormap();\n }\n\n // map image pixels to new palette\n if (this.dither) {\n this.ditherPixels(this.dither.replace('-serpentine', ''), this.dither.match(/-serpentine/) !== null);\n } else {\n this.indexPixels();\n }\n\n this.pixels = null;\n this.colorDepth = 8;\n this.palSize = 7;\n\n // get closest match to transparent color if specified\n if (this.transparent !== null) {\n this.transIndex = this.findClosest(this.transparent, true);\n }\n};\n\n/*\n 
Index pixels, without dithering\n*/\nGIFEncoder.prototype.indexPixels = function(imgq) {\n var nPix = this.pixels.length / 3;\n this.indexedPixels = new Uint8Array(nPix);\n var k = 0;\n for (var j = 0; j < nPix; j++) {\n var index = this.findClosestRGB(\n this.pixels[k++] & 0xff,\n this.pixels[k++] & 0xff,\n this.pixels[k++] & 0xff\n );\n this.usedEntry[index] = true;\n this.indexedPixels[j] = index;\n }\n};\n\n/*\n Taken from http://jsbin.com/iXofIji/2/edit by PAEz\n*/\nGIFEncoder.prototype.ditherPixels = function(kernel, serpentine) {\n var kernels = {\n FalseFloydSteinberg: [\n [3 / 8, 1, 0],\n [3 / 8, 0, 1],\n [2 / 8, 1, 1]\n ],\n FloydSteinberg: [\n [7 / 16, 1, 0],\n [3 / 16, -1, 1],\n [5 / 16, 0, 1],\n [1 / 16, 1, 1]\n ],\n Stucki: [\n [8 / 42, 1, 0],\n [4 / 42, 2, 0],\n [2 / 42, -2, 1],\n [4 / 42, -1, 1],\n [8 / 42, 0, 1],\n [4 / 42, 1, 1],\n [2 / 42, 2, 1],\n [1 / 42, -2, 2],\n [2 / 42, -1, 2],\n [4 / 42, 0, 2],\n [2 / 42, 1, 2],\n [1 / 42, 2, 2]\n ],\n Atkinson: [\n [1 / 8, 1, 0],\n [1 / 8, 2, 0],\n [1 / 8, -1, 1],\n [1 / 8, 0, 1],\n [1 / 8, 1, 1],\n [1 / 8, 0, 2]\n ]\n };\n\n if (!kernel || !kernels[kernel]) {\n throw 'Unknown dithering kernel: ' + kernel;\n }\n\n var ds = kernels[kernel];\n var index = 0,\n height = this.height,\n width = this.width,\n data = this.pixels;\n var direction = serpentine ? -1 : 1;\n\n this.indexedPixels = new Uint8Array(this.pixels.length / 3);\n\n for (var y = 0; y < height; y++) {\n\n if (serpentine) direction = direction * -1;\n\n for (var x = (direction == 1 ? 0 : width - 1), xend = (direction == 1 ? 
width : 0); x !== xend; x += direction) {\n\n index = (y * width) + x;\n // Get original colour\n var idx = index * 3;\n var r1 = data[idx];\n var g1 = data[idx + 1];\n var b1 = data[idx + 2];\n\n // Get converted colour\n idx = this.findClosestRGB(r1, g1, b1);\n this.usedEntry[idx] = true;\n this.indexedPixels[index] = idx;\n idx *= 3;\n var r2 = this.colorTab[idx];\n var g2 = this.colorTab[idx + 1];\n var b2 = this.colorTab[idx + 2];\n\n var er = r1 - r2;\n var eg = g1 - g2;\n var eb = b1 - b2;\n\n for (var i = (direction == 1 ? 0: ds.length - 1), end = (direction == 1 ? ds.length : 0); i !== end; i += direction) {\n var x1 = ds[i][1]; // *direction; // Should this by timesd by direction?..to make the kernel go in the opposite direction....got no idea....\n var y1 = ds[i][2];\n if (x1 + x >= 0 && x1 + x < width && y1 + y >= 0 && y1 + y < height) {\n var d = ds[i][0];\n idx = index + x1 + (y1 * width);\n idx *= 3;\n\n data[idx] = Math.max(0, Math.min(255, data[idx] + er * d));\n data[idx + 1] = Math.max(0, Math.min(255, data[idx + 1] + eg * d));\n data[idx + 2] = Math.max(0, Math.min(255, data[idx + 2] + eb * d));\n }\n }\n }\n }\n};\n\n/*\n Returns index of palette color closest to c\n*/\nGIFEncoder.prototype.findClosest = function(c, used) {\n return this.findClosestRGB((c & 0xFF0000) >> 16, (c & 0x00FF00) >> 8, (c & 0x0000FF), used);\n};\n\nGIFEncoder.prototype.findClosestRGB = function(r, g, b, used) {\n if (this.colorTab === null) return -1;\n\n if (this.neuQuant && !used) {\n return this.neuQuant.lookupRGB(r, g, b);\n }\n \n var c = b | (g << 8) | (r << 16);\n\n var minpos = 0;\n var dmin = 256 * 256 * 256;\n var len = this.colorTab.length;\n\n for (var i = 0, index = 0; i < len; index++) {\n var dr = r - (this.colorTab[i++] & 0xff);\n var dg = g - (this.colorTab[i++] & 0xff);\n var db = b - (this.colorTab[i++] & 0xff);\n var d = dr * dr + dg * dg + db * db;\n if ((!used || this.usedEntry[index]) && (d < dmin)) {\n dmin = d;\n minpos = index;\n }\n }\n\n 
return minpos;\n};\n\n/*\n Extracts image pixels into byte array pixels\n (removes alphachannel from canvas imagedata)\n*/\nGIFEncoder.prototype.getImagePixels = function() {\n var w = this.width;\n var h = this.height;\n this.pixels = new Uint8Array(w * h * 3);\n\n var data = this.image;\n var srcPos = 0;\n var count = 0;\n\n for (var i = 0; i < h; i++) {\n for (var j = 0; j < w; j++) {\n this.pixels[count++] = data[srcPos++];\n this.pixels[count++] = data[srcPos++];\n this.pixels[count++] = data[srcPos++];\n srcPos++;\n }\n }\n};\n\n/*\n Writes Graphic Control Extension\n*/\nGIFEncoder.prototype.writeGraphicCtrlExt = function() {\n this.out.writeByte(0x21); // extension introducer\n this.out.writeByte(0xf9); // GCE label\n this.out.writeByte(4); // data block size\n\n var transp, disp;\n if (this.transparent === null) {\n transp = 0;\n disp = 0; // dispose = no action\n } else {\n transp = 1;\n disp = 2; // force clear if using transparent color\n }\n\n if (this.dispose >= 0) {\n disp = this.dispose & 7; // user override\n }\n disp <<= 2;\n\n // packed fields\n this.out.writeByte(\n 0 | // 1:3 reserved\n disp | // 4:6 disposal\n 0 | // 7 user input - 0 = none\n transp // 8 transparency flag\n );\n\n this.writeShort(this.delay); // delay x 1/100 sec\n this.out.writeByte(this.transIndex); // transparent color index\n this.out.writeByte(0); // block terminator\n};\n\n/*\n Writes Image Descriptor\n*/\nGIFEncoder.prototype.writeImageDesc = function() {\n this.out.writeByte(0x2c); // image separator\n this.writeShort(0); // image position x,y = 0,0\n this.writeShort(0);\n this.writeShort(this.width); // image size\n this.writeShort(this.height);\n\n // packed fields\n if (this.firstFrame || this.globalPalette) {\n // no LCT - GCT is used for first (or only) frame\n this.out.writeByte(0);\n } else {\n // specify normal LCT\n this.out.writeByte(\n 0x80 | // 1 local color table 1=yes\n 0 | // 2 interlace - 0=no\n 0 | // 3 sorted - 0=no\n 0 | // 4-5 reserved\n this.palSize // 
6-8 size of color table\n );\n }\n};\n\n/*\n Writes Logical Screen Descriptor\n*/\nGIFEncoder.prototype.writeLSD = function() {\n // logical screen size\n this.writeShort(this.width);\n this.writeShort(this.height);\n\n // packed fields\n this.out.writeByte(\n 0x80 | // 1 : global color table flag = 1 (gct used)\n 0x70 | // 2-4 : color resolution = 7\n 0x00 | // 5 : gct sort flag = 0\n this.palSize // 6-8 : gct size\n );\n\n this.out.writeByte(0); // background color index\n this.out.writeByte(0); // pixel aspect ratio - assume 1:1\n};\n\n/*\n Writes Netscape application extension to define repeat count.\n*/\nGIFEncoder.prototype.writeNetscapeExt = function() {\n this.out.writeByte(0x21); // extension introducer\n this.out.writeByte(0xff); // app extension label\n this.out.writeByte(11); // block size\n this.out.writeUTFBytes('NETSCAPE2.0'); // app id + auth code\n this.out.writeByte(3); // sub-block size\n this.out.writeByte(1); // loop sub-block id\n this.writeShort(this.repeat); // loop count (extra iterations, 0=repeat forever)\n this.out.writeByte(0); // block terminator\n};\n\n/*\n Writes color table\n*/\nGIFEncoder.prototype.writePalette = function() {\n this.out.writeBytes(this.colorTab);\n var n = (3 * 256) - this.colorTab.length;\n for (var i = 0; i < n; i++)\n this.out.writeByte(0);\n};\n\nGIFEncoder.prototype.writeShort = function(pValue) {\n this.out.writeByte(pValue & 0xFF);\n this.out.writeByte((pValue >> 8) & 0xFF);\n};\n\n/*\n Encodes and writes pixel data\n*/\nGIFEncoder.prototype.writePixels = function() {\n var enc = new LZWEncoder(this.width, this.height, this.indexedPixels, this.colorDepth);\n enc.encode(this.out);\n};\n\n/*\n Retrieves the GIF stream\n*/\nGIFEncoder.prototype.stream = function() {\n return this.out;\n};\n\nmodule.exports = GIFEncoder;\n","/*\n LZWEncoder.js\n\n Authors\n Kevin Weiner (original Java version - kweiner@fmsware.com)\n Thibault Imbert (AS3 version - bytearray.org)\n Johan Nordberg (JS version - 
code@johan-nordberg.com)\n\n Acknowledgements\n GIFCOMPR.C - GIF Image compression routines\n Lempel-Ziv compression based on 'compress'. GIF modifications by\n David Rowley (mgardi@watdcsu.waterloo.edu)\n GIF Image compression - modified 'compress'\n Based on: compress.c - File compression ala IEEE Computer, June 1984.\n By Authors: Spencer W. Thomas (decvax!harpo!utah-cs!utah-gr!thomas)\n Jim McKie (decvax!mcvax!jim)\n Steve Davies (decvax!vax135!petsd!peora!srd)\n Ken Turkowski (decvax!decwrl!turtlevax!ken)\n James A. Woods (decvax!ihnp4!ames!jaw)\n Joe Orost (decvax!vax135!petsd!joe)\n*/\n\nvar EOF = -1;\nvar BITS = 12;\nvar HSIZE = 5003; // 80% occupancy\nvar masks = [0x0000, 0x0001, 0x0003, 0x0007, 0x000F, 0x001F,\n 0x003F, 0x007F, 0x00FF, 0x01FF, 0x03FF, 0x07FF,\n 0x0FFF, 0x1FFF, 0x3FFF, 0x7FFF, 0xFFFF];\n\nfunction LZWEncoder(width, height, pixels, colorDepth) {\n var initCodeSize = Math.max(2, colorDepth);\n\n var accum = new Uint8Array(256);\n var htab = new Int32Array(HSIZE);\n var codetab = new Int32Array(HSIZE);\n\n var cur_accum, cur_bits = 0;\n var a_count;\n var free_ent = 0; // first unused entry\n var maxcode;\n\n // block compression parameters -- after all codes are used up,\n // and compression rate changes, start over.\n var clear_flg = false;\n\n // Algorithm: use open addressing double hashing (no chaining) on the\n // prefix code / next character combination. We do a variant of Knuth's\n // algorithm D (vol. 3, sec. 6.4) along with G. Knott's relatively-prime\n // secondary probe. Here, the modular division first probe is gives way\n // to a faster exclusive-or manipulation. Also do block compression with\n // an adaptive reset, whereby the code table is cleared when the compression\n // ratio decreases, but after the table fills. The variable-length output\n // codes are re-sized at this point, and a special CLEAR code is generated\n // for the decompressor. 
Late addition: construct the table according to\n // file size for noticeable speed improvement on small files. Please direct\n // questions about this implementation to ames!jaw.\n var g_init_bits, ClearCode, EOFCode;\n\n // Add a character to the end of the current packet, and if it is 254\n // characters, flush the packet to disk.\n function char_out(c, outs) {\n accum[a_count++] = c;\n if (a_count >= 254) flush_char(outs);\n }\n\n // Clear out the hash table\n // table clear for block compress\n function cl_block(outs) {\n cl_hash(HSIZE);\n free_ent = ClearCode + 2;\n clear_flg = true;\n output(ClearCode, outs);\n }\n\n // Reset code table\n function cl_hash(hsize) {\n for (var i = 0; i < hsize; ++i) htab[i] = -1;\n }\n\n function compress(init_bits, outs) {\n var fcode, c, i, ent, disp, hsize_reg, hshift;\n\n // Set up the globals: g_init_bits - initial number of bits\n g_init_bits = init_bits;\n\n // Set up the necessary values\n clear_flg = false;\n n_bits = g_init_bits;\n maxcode = MAXCODE(n_bits);\n\n ClearCode = 1 << (init_bits - 1);\n EOFCode = ClearCode + 1;\n free_ent = ClearCode + 2;\n\n a_count = 0; // clear packet\n\n ent = nextPixel();\n\n hshift = 0;\n for (fcode = HSIZE; fcode < 65536; fcode *= 2) ++hshift;\n hshift = 8 - hshift; // set hash code range bound\n hsize_reg = HSIZE;\n cl_hash(hsize_reg); // clear hash table\n\n output(ClearCode, outs);\n\n outer_loop: while ((c = nextPixel()) != EOF) {\n fcode = (c << BITS) + ent;\n i = (c << hshift) ^ ent; // xor hashing\n if (htab[i] === fcode) {\n ent = codetab[i];\n continue;\n } else if (htab[i] >= 0) { // non-empty slot\n disp = hsize_reg - i; // secondary hash (after G. 
Knott)\n if (i === 0) disp = 1;\n do {\n if ((i -= disp) < 0) i += hsize_reg;\n if (htab[i] === fcode) {\n ent = codetab[i];\n continue outer_loop;\n }\n } while (htab[i] >= 0);\n }\n output(ent, outs);\n ent = c;\n if (free_ent < 1 << BITS) {\n codetab[i] = free_ent++; // code -> hashtable\n htab[i] = fcode;\n } else {\n cl_block(outs);\n }\n }\n\n // Put out the final code.\n output(ent, outs);\n output(EOFCode, outs);\n }\n\n function encode(outs) {\n outs.writeByte(initCodeSize); // write \"initial code size\" byte\n remaining = width * height; // reset navigation variables\n curPixel = 0;\n compress(initCodeSize + 1, outs); // compress and write the pixel data\n outs.writeByte(0); // write block terminator\n }\n\n // Flush the packet to disk, and reset the accumulator\n function flush_char(outs) {\n if (a_count > 0) {\n outs.writeByte(a_count);\n outs.writeBytes(accum, 0, a_count);\n a_count = 0;\n }\n }\n\n function MAXCODE(n_bits) {\n return (1 << n_bits) - 1;\n }\n\n // Return the next pixel from the image\n function nextPixel() {\n if (remaining === 0) return EOF;\n --remaining;\n var pix = pixels[curPixel++];\n return pix & 0xff;\n }\n\n function output(code, outs) {\n cur_accum &= masks[cur_bits];\n\n if (cur_bits > 0) cur_accum |= (code << cur_bits);\n else cur_accum = code;\n\n cur_bits += n_bits;\n\n while (cur_bits >= 8) {\n char_out((cur_accum & 0xff), outs);\n cur_accum >>= 8;\n cur_bits -= 8;\n }\n\n // If the next entry is going to be too big for the code size,\n // then increase it, if possible.\n if (free_ent > maxcode || clear_flg) {\n if (clear_flg) {\n maxcode = MAXCODE(n_bits = g_init_bits);\n clear_flg = false;\n } else {\n ++n_bits;\n if (n_bits == BITS) maxcode = 1 << BITS;\n else maxcode = MAXCODE(n_bits);\n }\n }\n\n if (code == EOFCode) {\n // At EOF, write the rest of the buffer.\n while (cur_bits > 0) {\n char_out((cur_accum & 0xff), outs);\n cur_accum >>= 8;\n cur_bits -= 8;\n }\n flush_char(outs);\n }\n }\n\n this.encode = 
encode;\n}\n\nmodule.exports = LZWEncoder;\n","/* NeuQuant Neural-Net Quantization Algorithm\n * ------------------------------------------\n *\n * Copyright (c) 1994 Anthony Dekker\n *\n * NEUQUANT Neural-Net quantization algorithm by Anthony Dekker, 1994.\n * See \"Kohonen neural networks for optimal colour quantization\"\n * in \"Network: Computation in Neural Systems\" Vol. 5 (1994) pp 351-367.\n * for a discussion of the algorithm.\n * See also http://members.ozemail.com.au/~dekker/NEUQUANT.HTML\n *\n * Any party obtaining a copy of these files from the author, directly or\n * indirectly, is granted, free of charge, a full and unrestricted irrevocable,\n * world-wide, paid up, royalty-free, nonexclusive right and license to deal\n * in this software and documentation files (the \"Software\"), including without\n * limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,\n * and/or sell copies of the Software, and to permit persons who receive\n * copies from any such party to do so, with the only requirement being\n * that this copyright notice remain intact.\n *\n * (JavaScript port 2012 by Johan Nordberg)\n */\n\nvar ncycles = 100; // number of learning cycles\nvar netsize = 256; // number of colors used\nvar maxnetpos = netsize - 1;\n\n// defs for freq and bias\nvar netbiasshift = 4; // bias for colour values\nvar intbiasshift = 16; // bias for fractions\nvar intbias = (1 << intbiasshift);\nvar gammashift = 10;\nvar gamma = (1 << gammashift);\nvar betashift = 10;\nvar beta = (intbias >> betashift); /* beta = 1/1024 */\nvar betagamma = (intbias << (gammashift - betashift));\n\n// defs for decreasing radius factor\nvar initrad = (netsize >> 3); // for 256 cols, radius starts\nvar radiusbiasshift = 6; // at 32.0 biased by 6 bits\nvar radiusbias = (1 << radiusbiasshift);\nvar initradius = (initrad * radiusbias); //and decreases by a\nvar radiusdec = 30; // factor of 1/30 each cycle\n\n// defs for decreasing alpha factor\nvar 
alphabiasshift = 10; // alpha starts at 1.0\nvar initalpha = (1 << alphabiasshift);\nvar alphadec; // biased by 10 bits\n\n/* radbias and alpharadbias used for radpower calculation */\nvar radbiasshift = 8;\nvar radbias = (1 << radbiasshift);\nvar alpharadbshift = (alphabiasshift + radbiasshift);\nvar alpharadbias = (1 << alpharadbshift);\n\n// four primes near 500 - assume no image has a length so large that it is\n// divisible by all four primes\nvar prime1 = 499;\nvar prime2 = 491;\nvar prime3 = 487;\nvar prime4 = 503;\nvar minpicturebytes = (3 * prime4);\n\n/*\n Constructor: NeuQuant\n\n Arguments:\n\n pixels - array of pixels in RGB format\n samplefac - sampling factor 1 to 30 where lower is better quality\n\n >\n > pixels = [r, g, b, r, g, b, r, g, b, ..]\n >\n*/\nfunction NeuQuant(pixels, samplefac) {\n var network; // int[netsize][4]\n var netindex; // for network lookup - really 256\n\n // bias and freq arrays for learning\n var bias;\n var freq;\n var radpower;\n\n /*\n Private Method: init\n\n sets up arrays\n */\n function init() {\n network = [];\n netindex = new Int32Array(256);\n bias = new Int32Array(netsize);\n freq = new Int32Array(netsize);\n radpower = new Int32Array(netsize >> 3);\n\n var i, v;\n for (i = 0; i < netsize; i++) {\n v = (i << (netbiasshift + 8)) / netsize;\n network[i] = new Float64Array([v, v, v, 0]);\n //network[i] = [v, v, v, 0]\n freq[i] = intbias / netsize;\n bias[i] = 0;\n }\n }\n\n /*\n Private Method: unbiasnet\n\n unbiases network to give byte values 0..255 and record position i to prepare for sort\n */\n function unbiasnet() {\n for (var i = 0; i < netsize; i++) {\n network[i][0] >>= netbiasshift;\n network[i][1] >>= netbiasshift;\n network[i][2] >>= netbiasshift;\n network[i][3] = i; // record color number\n }\n }\n\n /*\n Private Method: altersingle\n\n moves neuron *i* towards biased (b,g,r) by factor *alpha*\n */\n function altersingle(alpha, i, b, g, r) {\n network[i][0] -= (alpha * (network[i][0] - b)) / 
initalpha;\n network[i][1] -= (alpha * (network[i][1] - g)) / initalpha;\n network[i][2] -= (alpha * (network[i][2] - r)) / initalpha;\n }\n\n /*\n Private Method: alterneigh\n\n moves neurons in *radius* around index *i* towards biased (b,g,r) by factor *alpha*\n */\n function alterneigh(radius, i, b, g, r) {\n var lo = Math.abs(i - radius);\n var hi = Math.min(i + radius, netsize);\n\n var j = i + 1;\n var k = i - 1;\n var m = 1;\n\n var p, a;\n while ((j < hi) || (k > lo)) {\n a = radpower[m++];\n\n if (j < hi) {\n p = network[j++];\n p[0] -= (a * (p[0] - b)) / alpharadbias;\n p[1] -= (a * (p[1] - g)) / alpharadbias;\n p[2] -= (a * (p[2] - r)) / alpharadbias;\n }\n\n if (k > lo) {\n p = network[k--];\n p[0] -= (a * (p[0] - b)) / alpharadbias;\n p[1] -= (a * (p[1] - g)) / alpharadbias;\n p[2] -= (a * (p[2] - r)) / alpharadbias;\n }\n }\n }\n\n /*\n Private Method: contest\n\n searches for biased BGR values\n */\n function contest(b, g, r) {\n /*\n finds closest neuron (min dist) and updates freq\n finds best neuron (min dist-bias) and returns position\n for frequently chosen neurons, freq[i] is high and bias[i] is negative\n bias[i] = gamma * ((1 / netsize) - freq[i])\n */\n\n var bestd = ~(1 << 31);\n var bestbiasd = bestd;\n var bestpos = -1;\n var bestbiaspos = bestpos;\n\n var i, n, dist, biasdist, betafreq;\n for (i = 0; i < netsize; i++) {\n n = network[i];\n\n dist = Math.abs(n[0] - b) + Math.abs(n[1] - g) + Math.abs(n[2] - r);\n if (dist < bestd) {\n bestd = dist;\n bestpos = i;\n }\n\n biasdist = dist - ((bias[i]) >> (intbiasshift - netbiasshift));\n if (biasdist < bestbiasd) {\n bestbiasd = biasdist;\n bestbiaspos = i;\n }\n\n betafreq = (freq[i] >> betashift);\n freq[i] -= betafreq;\n bias[i] += (betafreq << gammashift);\n }\n\n freq[bestpos] += beta;\n bias[bestpos] -= betagamma;\n\n return bestbiaspos;\n }\n\n /*\n Private Method: inxbuild\n\n sorts network and builds netindex[0..255]\n */\n function inxbuild() {\n var i, j, p, q, smallpos, smallval, 
previouscol = 0, startpos = 0;\n for (i = 0; i < netsize; i++) {\n p = network[i];\n smallpos = i;\n smallval = p[1]; // index on g\n // find smallest in i..netsize-1\n for (j = i + 1; j < netsize; j++) {\n q = network[j];\n if (q[1] < smallval) { // index on g\n smallpos = j;\n smallval = q[1]; // index on g\n }\n }\n q = network[smallpos];\n // swap p (i) and q (smallpos) entries\n if (i != smallpos) {\n j = q[0]; q[0] = p[0]; p[0] = j;\n j = q[1]; q[1] = p[1]; p[1] = j;\n j = q[2]; q[2] = p[2]; p[2] = j;\n j = q[3]; q[3] = p[3]; p[3] = j;\n }\n // smallval entry is now in position i\n\n if (smallval != previouscol) {\n netindex[previouscol] = (startpos + i) >> 1;\n for (j = previouscol + 1; j < smallval; j++)\n netindex[j] = i;\n previouscol = smallval;\n startpos = i;\n }\n }\n netindex[previouscol] = (startpos + maxnetpos) >> 1;\n for (j = previouscol + 1; j < 256; j++)\n netindex[j] = maxnetpos; // really 256\n }\n\n /*\n Private Method: inxsearch\n\n searches for BGR values 0..255 and returns a color index\n */\n function inxsearch(b, g, r) {\n var a, p, dist;\n\n var bestd = 1000; // biggest possible dist is 256*3\n var best = -1;\n\n var i = netindex[g]; // index on g\n var j = i - 1; // start at netindex[g] and work outwards\n\n while ((i < netsize) || (j >= 0)) {\n if (i < netsize) {\n p = network[i];\n dist = p[1] - g; // inx key\n if (dist >= bestd) i = netsize; // stop iter\n else {\n i++;\n if (dist < 0) dist = -dist;\n a = p[0] - b; if (a < 0) a = -a;\n dist += a;\n if (dist < bestd) {\n a = p[2] - r; if (a < 0) a = -a;\n dist += a;\n if (dist < bestd) {\n bestd = dist;\n best = p[3];\n }\n }\n }\n }\n if (j >= 0) {\n p = network[j];\n dist = g - p[1]; // inx key - reverse dif\n if (dist >= bestd) j = -1; // stop iter\n else {\n j--;\n if (dist < 0) dist = -dist;\n a = p[0] - b; if (a < 0) a = -a;\n dist += a;\n if (dist < bestd) {\n a = p[2] - r; if (a < 0) a = -a;\n dist += a;\n if (dist < bestd) {\n bestd = dist;\n best = p[3];\n }\n }\n }\n }\n 
}\n\n return best;\n }\n\n /*\n Private Method: learn\n\n \"Main Learning Loop\"\n */\n function learn() {\n var i;\n\n var lengthcount = pixels.length;\n var alphadec = 30 + ((samplefac - 1) / 3);\n var samplepixels = lengthcount / (3 * samplefac);\n var delta = ~~(samplepixels / ncycles);\n var alpha = initalpha;\n var radius = initradius;\n\n var rad = radius >> radiusbiasshift;\n\n if (rad <= 1) rad = 0;\n for (i = 0; i < rad; i++)\n radpower[i] = alpha * (((rad * rad - i * i) * radbias) / (rad * rad));\n\n var step;\n if (lengthcount < minpicturebytes) {\n samplefac = 1;\n step = 3;\n } else if ((lengthcount % prime1) !== 0) {\n step = 3 * prime1;\n } else if ((lengthcount % prime2) !== 0) {\n step = 3 * prime2;\n } else if ((lengthcount % prime3) !== 0) {\n step = 3 * prime3;\n } else {\n step = 3 * prime4;\n }\n\n var b, g, r, j;\n var pix = 0; // current pixel\n\n i = 0;\n while (i < samplepixels) {\n b = (pixels[pix] & 0xff) << netbiasshift;\n g = (pixels[pix + 1] & 0xff) << netbiasshift;\n r = (pixels[pix + 2] & 0xff) << netbiasshift;\n\n j = contest(b, g, r);\n\n altersingle(alpha, j, b, g, r);\n if (rad !== 0) alterneigh(rad, j, b, g, r); // alter neighbours\n\n pix += step;\n if (pix >= lengthcount) pix -= lengthcount;\n\n i++;\n\n if (delta === 0) delta = 1;\n if (i % delta === 0) {\n alpha -= alpha / alphadec;\n radius -= radius / radiusdec;\n rad = radius >> radiusbiasshift;\n\n if (rad <= 1) rad = 0;\n for (j = 0; j < rad; j++)\n radpower[j] = alpha * (((rad * rad - j * j) * radbias) / (rad * rad));\n }\n }\n }\n\n /*\n Method: buildColormap\n\n 1. initializes network\n 2. trains it\n 3. removes misconceptions\n 4. 
builds colorindex\n */\n function buildColormap() {\n init();\n learn();\n unbiasnet();\n inxbuild();\n }\n this.buildColormap = buildColormap;\n\n /*\n Method: getColormap\n\n builds colormap from the index\n\n returns array in the format:\n\n >\n > [r, g, b, r, g, b, r, g, b, ..]\n >\n */\n function getColormap() {\n var map = [];\n var index = [];\n\n for (var i = 0; i < netsize; i++)\n index[network[i][3]] = i;\n\n var k = 0;\n for (var l = 0; l < netsize; l++) {\n var j = index[l];\n map[k++] = (network[j][0]);\n map[k++] = (network[j][1]);\n map[k++] = (network[j][2]);\n }\n return map;\n }\n this.getColormap = getColormap;\n\n /*\n Method: lookupRGB\n\n looks for the closest *r*, *g*, *b* color in the map and\n returns its index\n */\n this.lookupRGB = inxsearch;\n}\n\nmodule.exports = NeuQuant;\n","GIFEncoder = require './GIFEncoder.js'\n\nrenderFrame = (frame) ->\n encoder = new GIFEncoder frame.width, frame.height\n\n if frame.index is 0\n encoder.writeHeader()\n else\n encoder.firstFrame = false\n\n encoder.setTransparent frame.transparent\n encoder.setRepeat frame.repeat\n encoder.setDelay frame.delay\n encoder.setQuality frame.quality\n encoder.setDither frame.dither\n encoder.setGlobalPalette frame.globalPalette\n encoder.addFrame frame.data\n encoder.finish() if frame.last\n if frame.globalPalette == true\n frame.globalPalette = encoder.getGlobalPalette()\n\n stream = encoder.stream()\n frame.data = stream.pages\n frame.cursor = stream.cursor\n frame.pageSize = stream.constructor.pageSize\n\n if frame.canTransfer\n transfer = (page.buffer for page in frame.data)\n self.postMessage frame, transfer\n else\n self.postMessage frame\n\nself.onmessage = (event) -> renderFrame event.data\n"]} \ No newline at end of file Index: openacs-4/packages/proctoring-support/www/resources/proctored-page.css =================================================================== RCS file: 
/usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/proctored-page.css,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/proctored-page.css 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,70 @@ +#wizard { + background-color: #ffffff; + margin: 100px auto; + padding: 40px; + width: 70%; + min-width: 300px; +} + +/* Hide all steps by default: */ +.tab { + display: none; +} + +#retryBtn { + display: none; +} + +#prevBtn { + background-color: #bbbbbb; +} + +/* Make circles that indicate the steps of the form: */ +.step { + height: 15px; + width: 15px; + margin: 0 2px; + background-color: #bbbbbb; + border: none; + border-radius: 50%; + display: inline-block; + opacity: 0.5; +} + +.step.active { + opacity: 1; +} + +/* Mark the steps that are finished and valid: */ +.step.finish { + background-color: #4CAF50; +} + +.wizard-video { + min-width: 100px; + margin: 0 auto; + display: none; +} + +#wizard { + display:none; +} + +#error-message { + margin-top: 20px +} + +.info_proctoring { + position: sticky; + top: 0px; + z-index: 1; + background: red; + color: white; + text-align: center; + padding: 5px; +} + +#steps { + text-align: center; + margin-top: 40px; +} \ No newline at end of file Index: openacs-4/packages/proctoring-support/www/resources/proctored-page.js =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/proctored-page.js,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/proctored-page.js 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,466 @@ +function modalAlert(message, handler) { + document.querySelector("#modal-messages .modal-body").innerHTML = message; + dialog = $('#modal-messages'); + if (typeof handler == 'function') { + dialog.on('hidden.bs.modal', handler); + } + dialog.modal('show'); +} + +function 
streamMuted(stream) { + var muted = false; + var audioTracks = stream.getAudioTracks(); + for (var i = 0; i < audioTracks.length; i++) { + var track = audioTracks[i]; + if (track.muted || + track.getSettings().volume == 0) { + muted = true; + break; + } + } + return muted; +} + +function embedAudioTrackFromStream(fromStream, toStream) { + if (fromStream == undefined) { + return; + } + var audioTracks = fromStream.getAudioTracks(); + if (audioTracks.length == 0) { + return; + } else { + toStream.addTrack(audioTracks[0]); + } + return toStream; +} + +function createIframe() { + console.log("creating iframe"); + var iframe = document.createElement("iframe"); + iframe.setAttribute("class", "embed-responsive-item"); + iframe.setAttribute("id", "proctored-iframe-" + objectId); + iframe.addEventListener("load", function(e) { + // Prevent loops of iframes: bring the iframe to the + // start when we detect it would land on the very URL + // of this page + var parentURL = location.href + location.search; + var iframeURL = this.contentWindow.location.href + this.contentWindow.location.search; + if (parentURL == iframeURL) { + this.src = objectURL; + } + console.log("iframe loaded"); + }); + document.querySelector("#proctored-iframe-placeholder").appendChild(iframe); + iframe.src = objectURL + console.log("iframe created"); +} + +function createPreview() { + var style; + var e = document.querySelector("#preview-placeholder"); + style = !hasPreview ? "position:absolute;top:0;left:0;" : ""; + e.setAttribute("style", style); + var canvas = document.createElement("canvas"); + style = hasPreview ? "height: 30px; width: 40px" : "height: 1px; width: 1px"; + canvas.setAttribute("style", style); + canvas.setAttribute("id", "audio-preview"); + e.appendChild(canvas); + new AudioWave(proctoring.streams[0], "#audio-preview"); + for (var i = 0; i < proctoring.videos.length; i++) { + var video = proctoring.videos[i]; + var width = hasPreview ? 
30 : 1; + video.setAttribute("height", width); + e.appendChild(video); + } +} + +var uploadQueue = []; +function scheduleUpload(name, type, blob) { + if (type == "image" && (blob == null || blob.size <= 8000)) { + if (name == "camera") { + modalAlert(blackPictureCameraMessage); + } else { + modalAlert(blackPictureDesktopMessage); + } + } + var formData = new FormData(); + formData.append("name", name); + formData.append("type", type); + formData.append("object_id", objectId); + formData.append("file", blob); + uploadQueue.push(formData); +} + +function upload() { + if (uploadQueue.length > 0) { + var formData = uploadQueue.shift(); + var request = new XMLHttpRequest(); + request.timeout = 10000; + request.addEventListener("readystatechange", function () { + if (this.readyState == 4) { + if(this.status == 200) { + if (this.response == "OK") { + setTimeout(upload); + } else { + location.href = objectURL; + } + } else { + uploadQueue.unshift(formData); + setTimeout(upload, 10000); + } + } + }); + request.addEventListener("timeout", function () { + uploadQueue.unshift(formData); + setTimeout(upload, 10000); + }); + request.open("POST", uploadURL); + request.send(formData); + } else { + setTimeout(upload, 1000); + } +} + +function approveStartExam() { + valid = false; + clearError(); + var formData = new FormData(); + formData.append("object_id", objectId); + var request = new XMLHttpRequest(); + request.timeout = 10000; + request.addEventListener("readystatechange", function () { + if (this.readyState == 4) { + if(this.status == 200) { + if (this.response == "OK") { + valid = true; + } else { + location.href = objectURL; + } + } else { + var errmsg = requestFailedMessage; + setError(errmsg); + setTimeout(approveStartExam, 10000); + } + } + }); + request.addEventListener("timeout", function () { + var errmsg = requestTimedOutMessage; + setError(errmsg); + setTimeout(approveStartExam, 10000); + }); + request.open("POST", examinationStatementURL); + request.send(formData); 
+} + + +var currentTab = 0; // Current tab is set to be the first tab (0) +var recheckBtn = document.querySelector("#retryBtn"); +recheckBtn.addEventListener("click", function(e) { + recheck(currentTab); +}); +document.querySelector("#prevBtn").addEventListener("click", function(e) { + nextPrev(-1); +}); +document.querySelector("#nextBtn").addEventListener("click", function(e) { + nextPrev(1); +}); + +var examinationStatement = document.querySelector('#examination-statement'); +var deskvideo = document.querySelector('#desktop'); +var camvideo = document.querySelector('#webcam'); +var audio = document.querySelector('#audio'); + +var streams = []; +var handlers = []; +if (hasExaminationStatement) { + handlers.push(approveStartExam); +} +if (hasProctoring) { + handlers.push(function () { + clearError(); + valid = false; + var errmsg; + if (isMobile) { + errmsg = mobileDevicesNotSupportedMessage; + } else if (!navigator.mediaDevices.getUserMedia) { + errmsg = cameraGrabbingNotSupportedMessage; + } else if (!navigator.mediaDevices.getDisplayMedia) { + errmsg = desktopGrabbingNotSupportedMessage; + } else { + valid = true + } + if (!valid) { + setError(errmsg); + } + }); + handlers.push(function () { + valid = false; + clearError(); + var constraints = { + audio: cameraConstraints.audio + }; + navigator.mediaDevices.getUserMedia(constraints).then(stream => { + if (streamMuted(stream)) { + throw yourMicrophoneIsMutedMessage; + } else { + camvideo.srcObject = stream; + new AudioWave(stream, "#audio"); + valid = true; + streams[0] = stream; + } + }).catch(err => { + if (err.name == "NotAllowedError") { + err = microphonePermissionDeniedMessage; + } else if (err.name == "NotFoundError") { + err = microphoneNotFoundMessage; + } else if (err.name == "NotReadableError") { + err = microphoneNotReadableMessage; + } + setError(err); + }); + }); + handlers.push(function () { + valid = false; + clearError(); + var constraints = { + video: cameraConstraints.video + }; + 
navigator.mediaDevices.getUserMedia(constraints).then(stream => { + camvideo.srcObject = stream; + camvideo.style.display = "block"; + streams[1] = stream; + camvideo.addEventListener("play", function() { + var canvas = document.createElement("canvas"); + canvas.width = camvideo.videoWidth; + canvas.height = camvideo.videoHeight; + canvas.getContext("2d").drawImage(camvideo, 0, 0, camvideo.videoWidth, camvideo.videoHeight); + canvas.toBlob(function(blob) { + if (blob == null || blob.size < 24000) { + var errmsg = blackPictureCameraMessage; + setError(errmsg); + } + }, "image/jpeg"); + }); + valid = true; + }).catch(err => { + if (err.name == "NotAllowedError") { + err = cameraPermissionDeniedMessage; + } else if (err.name == "NotFoundError") { + err = cameraNotFoundMessage; + } else if (err.name == "NotReadableError") { + err = cameraNotReadableMessage; + } + setError(err); + }); + }); + handlers.push(function () { + valid = false; + clearError(); + var constraints = { + video: desktopConstraints.video + }; + navigator.mediaDevices.getDisplayMedia(constraints).then(stream=> { + var requestedStream = constraints.video.displaySurface; + var selectedStream = stream.getVideoTracks()[0].getSettings().displaySurface; + // If user requested for a specific displaysurface + // and browser supports it, also check that the + // one selected is right. + if (requestedStream == undefined || + (selectedStream != undefined && + requestedStream == selectedStream)) { + deskvideo.srcObject = stream; + deskvideo.style.display = "block"; + valid = true; + streams[2] = stream; + } else { + if (selectedStream != undefined) { + throw wrongDisplaySurfaceSelectedMessage; + } else { + throw displaySurfaceNotSupportedMessage; + } + } + }).catch(err => { + if (err.name == "NotAllowedError") { + err = desktopPermissionDeniedMessage; + } + setError(err); + }); + }); +} + +function showTab(n) { + // This function will display the specified tab of the form... 
+ var x = document.getElementsByClassName("tab"); + x[n].style.display = "block"; + if (typeof handlers[n] == "function") { + handlers[n](); + } else { + valid = true; + } + //... and fix the Previous/Next buttons: + if (n == 0) { + document.getElementById("prevBtn").style.display = "none"; + } else { + document.getElementById("prevBtn").style.display = "inline"; + } + if (n == (x.length - 1)) { + document.getElementById("nextBtn").innerHTML = submitLabel; + } else { + document.getElementById("nextBtn").innerHTML = nextLabel; + } + //... and run a function that will display the correct step indicator: + fixStepIndicator(n) +} + +var errorEl = document.querySelector("#error-message"); +function clearError() { + errorEl.innerHTML = ""; + retryBtn.style.display = "none"; +} + +function setError(errmsg) { + // console.error(errmsg); + errorEl.innerHTML = errmsg; + valid = false; + retryBtn.style.display = "inline"; +} + +function recheck(n) { + handlers[n](); +} + +function nextPrev(n) { + // This function will figure out which tab to display + var x = document.getElementsByClassName("tab"); + // Exit the function if any field in the current tab is invalid: + if (n == 1 && !validateForm()) return false; + // Hide the current tab: + x[currentTab].style.display = "none"; + // Increase or decrease the current tab by 1: + currentTab = currentTab + n; + // if you have reached the end of the form... + if (currentTab >= x.length) { + // ... 
the form gets submitted: + // location.href = ""; + //document.getElementById("regForm").submit(); + startExam(); + return false; + } + // Otherwise, display the correct tab: + showTab(currentTab); +} + +// Retrieving the stream happens asynchronously +var valid = false; +function validateForm() { + // If the valid status is true, mark the step as finished and valid: + if (valid) { + document.getElementsByClassName("step")[currentTab].className += " finish"; + } + return valid; // return the valid status +} + +function fixStepIndicator(n) { + // This function removes the "active" class of all steps... + var i, x = document.getElementsByClassName("step"); + for (i = 0; i < x.length; i++) { + x[i].className = x[i].className.replace(" active", ""); + } + //... and adds the "active" class on the current step: + x[n].className += " active"; +} + +var cameraConstraints = { + video: { + width: { max: 640 }, + height: { max: 480 } + }, + audio: true +}; +var desktopConstraints = { + video: { + width: 1280, + height: 960, + displaySurface: "monitor" + } +}; + +var audioHandlers; +if (hasAudio) { + audioHandlers = { + auto: function(blob) { + scheduleUpload("camera", "audio", blob); + } + }; +} + +function startExam() { + document.querySelector("#wizard").style.display = "none"; + document.querySelector("#proctoring").style.display = "block"; + + var cameraStream = embedAudioTrackFromStream(streams[0], streams[1]); + var desktopStream = streams[2]; + var conf = { + minMsInterval: minMsInterval, + maxMsInterval: maxMsInterval, + minAudioDuration: minAudioDuration, + maxAudioDuration: maxAudioDuration, + onMissingStreamHandler : function(streamName, errMsg) { + modalAlert(missingStreamMessage, function() { + location.reload(); + }); + }, + onReadyHandler: function() { + createIframe(); + createPreview(); + }, + mediaConf: { + camera: { + required: true, + grayscale: true, + width: 320, + height: 240, + imageHandlers: { + jpeg: { + blob: function(blob) { + 
scheduleUpload("camera", "image", blob); + } + } + }, + audioHandlers: audioHandlers, + stream: cameraStream + }, + desktop: { + required: true, + grayscale: false, + imageHandlers: { + jpeg: { + blob: function(blob) { + scheduleUpload("desktop", "image", blob); + } + } + }, + stream: desktopStream + } + } + }; + + if (hasProctoring) { + console.log("creating proctoring"); + proctoring = new Proctoring(conf); + console.log("starting proctoring"); + proctoring.start(); + console.log("starting upload"); + upload(); + console.log("proctoring has started"); + } else { + createIframe(); + console.log("proctoring not requested"); + } +} + +window.addEventListener("load", function() { + document.querySelector("#proctoring").style.display = "none"; + document.querySelector("#wizard").style.display = "block"; + showTab(currentTab); // Display the current tab +}); Index: openacs-4/packages/proctoring-support/www/resources/proctoring.js =================================================================== RCS file: /usr/local/cvsroot/openacs-4/packages/proctoring-support/www/resources/Attic/proctoring.js,v diff -u -N --- /dev/null 1 Jan 1970 00:00:00 -0000 +++ openacs-4/packages/proctoring-support/www/resources/proctoring.js 10 Aug 2020 13:23:42 -0000 1.1.2.1 @@ -0,0 +1,807 @@ +// A class to implement lightweight student "proctoring" on +// browser-based applications. +// +// It works by grabbing audio or video input devices in this way: +// 1. audio - when an audio device is grabbed, any noise detected for +// longer than a certain time threshold will be made +// into an opus-encoded webm audio file and passed to +// configured callbacks +// 2. 
video - snapshots of configured video devices are captured from +// the browser at random intervals in one or more of +// the configured image formats and passed to +// configured callbacks +// +// Video capture supports still frame images (as jpeg or png) and also +// animated gifs created by concatenating all frames collected so +// far. Every image is automatically watermarked with the current +// timestamp and can be configured to be converted to grayscale +// (useful to generate smaller files at the expense of colors). +// +// Dependencies: gif.js (http://jnordberg.github.io/gif.js/) (only to +// generate animated gifs) +// +// Author: Antonio Pisano (antonio@elettrotecnica.it) +// +// Usage: to start a proctored session, create a new Proctoring +// instance by passing a configuration object to it. +// +// ## General Configuration Object Attributes ## +// +// - minMsInterval: applies to video stream grabbing. Min +// time interval to pass between two consecutive +// snapshots in milliseconds +// - maxMsInterval: applies to video stream grabbing. Max +// time interval to pass between two consecutive +// snapshots in milliseconds +// - minAudioDuration: when audio is recorded, any noisy interval +// longer than this number of seconds will be +// transformed into an audio file +// - maxAudioDuration: when audio is recorded, recordings longer than +// this number of seconds will be stopped +// automatically so that no recordings will be +// longer than this value +// - onMissingStreamHandler: this javascript handler will be triggered +// when one or more of the required streams +// becomes unavailable during the proctoring +// session (e.g. user disconnects the +// camera, or other error +// condition). Receives in input +// 'streamName', the name of the failing stream, +// and 'errMsg', the returned error message. 
+// - onMicrophoneTooLowHandler: this javascript handler will be +// triggered when the audio signal (currently +// coming only from the camera due to +// browser limitations) contains too +// little noise for the microphone to +// actually be working. Takes no argument. +// - onReadyHandler: this javascript handler is triggered as soon as +// the user has given access to all necessary input +// streams so that the proctored session can +// start. Does not receive any argument. +// - mediaConf: a JSON object that can have up to two attributes, +// 'camera' or 'desktop', to define the proctoring +// behavior and multimedia input configuration for the +// two kinds of devices. Each attribute's value is also a +// JSON object (see the "Media Configuration Attributes" section). +// +// ## Media Configuration Attributes ## +// +// Each attribute 'camera' or 'desktop' from mediaConf supports the +// following attributes: +// - required: this boolean flag decides if the stream is required and +// if proctoring should fail whenever this is not +// available. Defaults to false. +// - grayscale: boolean flag deciding if the captured images from this +// stream should be converted to grayscale. Defaults to +// false. +// - width / height: forced width and height. Tells the proctoring +// object that images from this stream should be +// forcefully rescaled to this size regardless of +// the size they were captured at. Needed to create +// lower resolution images from devices +// (e.g. webcams) that cannot produce images this +// small, such as some Apple cameras. +// - imageHandlers: a JSON object defining the handlers to be +// triggered whenever a new image from this stream is +// available. It supports 3 possible attributes, each +// named after the corresponding image type, 'png', +// 'jpeg' and 'gif'. The presence of one such +// attribute will enable the generation of an image +// in that type whenever a new snapshot is +// taken. 
Each attribute itself supports two possible +// attributes defining the handler type, 'blob' or +// 'base64'. The value of each of those is a +// javascript handler that expects to receive the +// blob or base64 value respectively of the generated +// image. +// - audioHandlers: a JSON object currently supporting just one 'auto' +// attribute. The value of this attribute is a +// javascript handler called whenever a new audio +// recording is available, receiving the blob file +// containing the audio recording. When this +// attribute is missing, audio will not be recorded. +// - stream: one can specify a ready-to-use MediaStream for camera or +// desktop. In this case, the acquisition of the device will +// be completely skipped and the stream will be assumed to +// comply with any user constraint. +// - constraints: a MediaTrackConstraints JSON object defining the +// real multimedia constraints for this device. See +// https://developer.mozilla.org/en-US/docs/Web/API/MediaTrackConstraints +// +// Example conf: +// +// var conf = { +// minMsInterval: 5000, +// maxMsInterval: 10000, +// minAudioDuration: 1, +// maxAudioDuration: 60, +// onMissingStreamHandler : function(streamName, errMsg) { +// alert("'" + streamName + "' stream is missing. 
Please reload the page and enable all mandatory input sources."); +// }, +// onReadyHandler : function() { +// console.log("All set!"); +// }, +// mediaConf: { +// camera: { +// required: true, +// grayscale: true, +// width: 320, +// height: 240, +// imageHandlers: { +// gif: { +// base64: function(base64data) { +// // this handler will be triggered +// // every time a new gif for this stream is +// // rendered and will receive the base64 +// // data in input +// var input = document.querySelector('input[name="proctoring1"]'); +// input.value = base64data; +// } +// }, +// png: { +// blob: function(blob) { +// pushImageToServer(blob); +// } +// } +// }, +// audioHandlers: { +// auto: function(blob) { +// // Do something with the audio blob +// } +// }, +// constraints: { +// video: { +// width: { max: 640 }, +// height: { max: 480 } +// } +// } +// }, +// desktop: { +// required: true, +// grayscale: false, +// imageHandlers: { +// gif: { +// base64: function(base64data) { +// // this handler will be triggered +// // every time a new gif for this stream is +// // rendered and will receive the base64 +// // data in input +// var input = document.querySelector('input[name="proctoring1"]'); +// input.value = base64data; +// } +// }, +// png: { +// base64: ...some handler +// blob:... +// }, +// jpeg: { +// ... +// ... +// } +// }, +// constraints: { +// video: { +// width: { max: 640 }, +// height: { max: 480 } +// } +// } +// } +// } +// }; +// var proctoring = new Proctoring(conf); +// proctoring.start(); +// + +Date.prototype.toTZISOString = function() { + var tzo = -this.getTimezoneOffset(), + dif = tzo >= 0 ? '+' : '-', + pad = function(num) { + var norm = Math.floor(Math.abs(num)); + return (norm < 10 ? 
'0' : '') + norm; + }; + return this.getFullYear() + + '-' + pad(this.getMonth() + 1) + + '-' + pad(this.getDate()) + + 'T' + pad(this.getHours()) + + ':' + pad(this.getMinutes()) + + ':' + pad(this.getSeconds()) + + dif + pad(tzo / 60) + + ':' + pad(tzo % 60); +} + +// Implements a recorder automatically grabbing audio samples when +// noise is detected for longer than a specified interval +class AutoAudioRecorder { + constructor(stream, + ondataavailable, + onmicrophonetoolow, + minDuration=5, + maxDuration=60, + sampleInterval=50) { + var autorec = this; + this.stream = new MediaStream(); + var audioTracks = stream.getAudioTracks(); + if (audioTracks.length == 0) { + throw "No audio track available in supplied stream"; + } + + // Get only audio tracks from the main stream object + audioTracks.forEach(function(track) { + autorec.stream.addTrack(track); + }); + + this.ondataavailable = ondataavailable; + this.onmicrophonetoolow = onmicrophonetoolow; + this.sampleInterval = sampleInterval; + this.minDuration = minDuration; + this.maxDuration = maxDuration; + this.stopHandle = null; + + // Prepare to sample stream properties + this.audioCtx = new (window.AudioContext || window.webkitAudioContext)(); + this.analyser = this.audioCtx.createAnalyser(); + this.source = this.audioCtx.createMediaStreamSource(this.stream); + this.source.connect(this.analyser); + this.analyser.fftSize = 2048; + this.bufferLength = this.analyser.frequencyBinCount; + this.dataArray = new Uint8Array(this.bufferLength); + + this.numPositiveSamples = 0; + this.noise = 0; + + // Audio frames we skip at the beginning when we check for + // silence, as the audio stream might start with a silent + // interval due to initialization. 
We skip the first 5s + this.nSkipSilentFrames = 5000 / this.sampleInterval; + + // Create an audio recorder + this.recorder = new MediaRecorder(this.stream, { + mimeType: 'audio/webm' + }); + + this.recorder.addEventListener("dataavailable", function(e) { + if (autorec.currentDuration() >= autorec.minDuration) { + autorec.ondataavailable(e.data); + } + autorec.numPositiveSamples = 0; + }); + } + + currentDuration() { + return (this.sampleInterval * this.numPositiveSamples) / 1000; + } + + someNoise() { + this.analyser.getByteTimeDomainData(this.dataArray); + var max = 0; + for(var i = 0; i < this.bufferLength; i++) { + var v = (this.dataArray[i] - 128.0) / 128.0; + if (v > max) { + max = v; + } + } + var decay = 500 / this.sampleInterval; + this.noise = (this.noise * (decay - 1) + max) / decay; + + if (this.nSkipSilentFrames == 0 && + this.noise < 0.000001) { + if (typeof this.onmicrophonetoolow == "function") { + this.onmicrophonetoolow(); + } + } else if (this.nSkipSilentFrames > 0) { + this.nSkipSilentFrames--; + // console.log("skipping: " + this.nSkipSilentFrames); + } + + return max > 0.01; + } + + silence() { + // console.log(this.noise); + return this.noise <= 0.01 + } + + autoRecord() { + if (this.someNoise()) { + if (this.recorder.state != "recording") { + this.recorder.start(); + } + this.numPositiveSamples++; + } else if (this.recorder.state != "inactive" && + this.silence()) { + this.recorder.stop(); + } + if (this.recorder.state != "inactive" && + this.currentDuration() >= this.maxDuration) { + this.recorder.stop(); + } + } + + start() { + this.stop(); + this.stopHandle = setInterval(this.autoRecord.bind(this), this.sampleInterval); + } + + stop() { + if (this.stopHandle != null) { + clearInterval(this.stopHandle); + } + if (this.recorder.state != "inactive") { + this.recorder.stop(); + } + } +} + +class Proctoring { + + constructor(conf) { + this.minMsInterval = conf.minMsInterval; + this.maxMsInterval = conf.maxMsInterval; + this.minAudioDuration = 
conf.minAudioDuration; + this.maxAudioDuration = conf.maxAudioDuration; + this.mediaConf = conf.mediaConf; + + this.streamNames = Object.keys(this.mediaConf); + this.numStreams = this.streamNames.length; + this.numCheckedStreams = 0; + this.numActiveStreams = 0; + + this.onReadyHandler = conf.onReadyHandler; + this.ready = false; + this.onMissingStreamHandler = conf.onMissingStreamHandler; + this.onMicrophoneTooLowHandler = conf.onMicrophoneTooLowHandler; + this.isMissingStreams = false; + this.streamErrors = ["", ""]; + + this.gifs = [null, null]; + this.imageHandlers = [null, null]; + this.audioHandlers = [null, null]; + this.pictures = [[], []]; + this.prevPictures = [null, null]; + this.streams = [null, null]; + this.videos = [null, null]; + + for (var i = 0; i < this.numStreams; i++) { + var streamName = this.streamNames[i]; + var conf = this.mediaConf[streamName]; + // streams are not required by default + if (conf.required == undefined) { + conf.required = false; + } + if (conf.imageHandlers != undefined) { + this.imageHandlers[i] = conf.imageHandlers; + } + if (conf.audioHandlers != undefined) { + this.audioHandlers[i] = conf.audioHandlers; + } + if (conf.stream instanceof MediaStream) { + if (streamName == "camera") { + this.useCameraStream(conf.stream); + } else { + this.useDesktopStream(conf.stream); + } + } + } + + this.acquireDevices(); + } + + useCameraStream(stream) { + var i = this.streamNames.indexOf("camera"); + if (this.audioHandlers[i] != null) { + new AutoAudioRecorder(stream, + this.audioHandlers[i].auto, + this.onMicrophoneTooLowHandler, + this.minAudioDuration, + this.maxAudioDuration).start(); + } + this.streams[i] = stream; + this.videos[i] = this.createVideo(stream); + this.numActiveStreams++; + this.numCheckedStreams++; + } + + useDesktopStream(stream) { + var i = this.streamNames.indexOf("desktop"); + this.streams[i] = stream; + this.videos[i] = this.createVideo(stream); + this.numActiveStreams++; + this.numCheckedStreams++; + } + + 
acquireDevices() { + var proctor = this; + + // Cam stream + if (this.mediaConf.camera != undefined && + this.mediaConf.camera.stream == undefined) { + if (!navigator.mediaDevices.getUserMedia && + !navigator.getUserMedia) { + var err = "getUserMedia not supported"; + proctor.streamErrors[proctor.streamNames.indexOf("camera")] = err; + console.log("Camera cannot be recorded: " + err); + proctor.numCheckedStreams++; + } else { + var camPromise = navigator.mediaDevices.getUserMedia ? + navigator.mediaDevices.getUserMedia(this.mediaConf.camera.constraints) : + navigator.getUserMedia(this.mediaConf.camera.constraints); + camPromise.then(stream => { + this.useCameraStream(stream); + }) + .catch(function (err) { + proctor.streamErrors[proctor.streamNames.indexOf("camera")] = err; + console.log("Camera cannot be recorded: " + err); + if (err.name == 'AbortError') { + proctor.numCheckedStreams = proctor.numStreams; + } else { + proctor.numCheckedStreams++; + } + }); + } + } + + // Desktop stream + if (this.mediaConf.desktop != undefined && + this.mediaConf.desktop.stream == undefined) { + if (!navigator.mediaDevices.getDisplayMedia && + !navigator.getDisplayMedia) { + var err = "getDisplayMedia not supported"; + proctor.streamErrors[proctor.streamNames.indexOf("desktop")] = err; + console.log("Desktop cannot be recorded: " + err); + proctor.numCheckedStreams++; + } else { + var desktopPromise = navigator.mediaDevices.getDisplayMedia ? + navigator.mediaDevices.getDisplayMedia(this.mediaConf.desktop.constraints) : + navigator.getDisplayMedia(this.mediaConf.desktop.constraints); + desktopPromise.then(stream => { + var requestedStream = this.mediaConf.desktop.constraints.video.displaySurface; + var selectedStream = stream.getVideoTracks()[0].getSettings().displaySurface; + // If displaySurface was specified, browser + // MUST support it and MUST be the right one. 
+ if (requestedStream == undefined || + (selectedStream != undefined && + requestedStream == selectedStream)) { + this.useDesktopStream(stream); + } else { + throw "'" + requestedStream +"' was requested, but '" + selectedStream + "' was selected"; + } + }) + .catch(function (err) { + proctor.streamErrors[proctor.streamNames.indexOf("desktop")] = err; + console.log("Desktop cannot be recorded: " + err); + if (err.name == 'AbortError') { + proctor.numCheckedStreams = proctor.numStreams; + } else { + proctor.numCheckedStreams++; + } + }); + } + } + } + + start() { + this.checkMissingStreams(); + this.takePictures(this.minMsInterval, this.maxMsInterval); + } + + reset() { + this.pictures = [[], []]; + } + + streamMuted(stream) { + var muted = false; + var audioTracks = stream.getAudioTracks(); + for (var i = 0; i < audioTracks.length; i++) { + var track = audioTracks[i]; + if (track.muted || + !track.enabled || + track.getSettings().volume == 0) { + muted = true; + break; + } + } + var videoTracks = stream.getVideoTracks(); + for (var i = 0; i < videoTracks.length; i++) { + var track = videoTracks[i]; + if (track.muted || + !track.enabled) { + muted = true; + break; + } + } + return muted; + } + + checkStream(stream, streamName) { + if (stream == null || + !stream.active || + this.streamMuted(stream)) { + if (this.mediaConf[streamName].required) { + return false; + } + } + return true; + } + + checkMissingStreams() { + if (!this.isMissingStreams && + this.numCheckedStreams == this.numStreams) { + for (var i = 0; i < this.streams.length; i++) { + var streamName = this.streamNames[i]; + if (!this.checkStream(this.streams[i], streamName)) { + this.isMissingStreams = true; + if (typeof this.onMissingStreamHandler == 'function') { + var err = this.streamErrors[i]; + this.onMissingStreamHandler(streamName, err); + } + } + } + } + + if (!this.isMissingStreams) { + setTimeout(this.checkMissingStreams.bind(this), 1000); + } + } + + renderGif(frames) { + if (frames.length == 0) 
{ + return; + } + var i = this.pictures.indexOf(frames); + if (this.gifs[i] == null) { + this.gifs[i] = new GIF({ + workers: 2, + quality: 30, + workerScript: Proctoring.webWorkerURL, + width: frames[0].width, + height: frames[0].height + }); + var proctor = this; + var gifs = this.gifs; + gifs[i].on('finished', function(blob) { + var handlers = proctor.imageHandlers[i]; + if (typeof handlers.gif.blob == 'function') { + handlers.gif.blob(blob); + } + if (typeof handlers.gif.base64 == 'function') { + var reader = new FileReader(); + reader.readAsDataURL(blob); + reader.onloadend = function() { + var base64data = reader.result; + handlers.gif.base64(base64data); + } + } + // Stop the workers and kill the gif object + this.abort(); + this.freeWorkers.forEach(w => w.terminate()); + gifs[gifs.indexOf(this)] = null; + }); + } + var gif = this.gifs[i]; + if (!gif.running) { + for (var j = 0; j < frames.length; j++) { + gif.addFrame(frames[j], {delay: 500}); + } + gif.render(); + } + } + + createVideo(stream) { + var video = document.createElement("video"); + video.muted = true; + video.autoplay = true; + video.preload = "auto"; + video.srcObject = stream; + video.addEventListener("loadeddata", function(e) { + if (this.paused) { + this.play(); + } + }); + // Try to ensure the video is never put to sleep + video.addEventListener("pause", function(e) { + this.play(); + }); + + return video; + } + + watermark(canvas, text) { + var ctx = canvas.getContext("2d"); + var fontSize = 0.032*canvas.width; + ctx.font = fontSize + "px monospace"; + ctx.fillStyle = "white"; + ctx.strokeStyle = "black"; + ctx.lineWidth = 0.5; + var metrics = ctx.measureText(text); + var x = canvas.width - metrics.width; + var y = canvas.height - fontSize; + ctx.fillText(text, x, y); + ctx.strokeText(text, x, y); + } + + canvasToGrayscale(canvas) { + var ctx = canvas.getContext("2d"); + var imageData = ctx.getImageData(0, 0, canvas.width, canvas.height); + var data = imageData.data; + for (var i = 0; i < 
data.length; i += 4) { + var avg = (data[i] + data[i + 1] + data[i + 2]) / 3; + data[i] = avg; // red + data[i + 1] = avg; // green + data[i + 2] = avg; // blue + } + ctx.putImageData(imageData, 0, 0); + } + + isCanvasMonochrome(canvas) { + var ctx = canvas.getContext("2d"); + var imageData = ctx.getImageData(0, 0, canvas.width, canvas.height); + var data = imageData.data; + var isMonochrome = true; + var firstPx = []; + for (var i = 0; i < data.length; i += 4) { + if (i == 0) { + firstPx[0] = data[i]; + firstPx[1] = data[i+1]; + firstPx[2] = data[i+2]; + } else if (firstPx[0] != data[i] || + firstPx[1] != data[i+1] || + firstPx[2] != data[i+2]) { + isMonochrome = false; + break; + } + } + + return isMonochrome; + } + + areCanvasEquals(canvas1, canvas2) { + var ctx1 = canvas1.getContext("2d"); + var imageData1 = ctx1.getImageData(0, 0, canvas1.width, canvas1.height); + var data1 = imageData1.data; + var ctx2 = canvas2.getContext("2d"); + var imageData2 = ctx2.getImageData(0, 0, canvas2.width, canvas2.height); + var data2 = imageData2.data; + var areEquals = true; + for (var i = 0; i < data1.length; i += 4) { + if (data1[i] != data2[i] || + data1[i+1] != data2[i+1] || + data1[i+2] != data2[i+2]) { + areEquals = false; + break; + } + } + + return areEquals; + } + + + takeShot(stream, grayscale) { + var i = this.streams.indexOf(stream); + var video = this.videos[i]; + + if (!video.paused) { + var streamName = this.streamNames[i]; + var conf = this.mediaConf[streamName]; + // var height = stream.getVideoTracks()[0].getSettings().height; + // var width = stream.getVideoTracks()[0].getSettings().width; + var iHeight = conf.height == undefined ? video.videoHeight : conf.height; + var iWidth = conf.width == undefined ? 
video.videoWidth : conf.width; + var proctor = this; + var pictures = this.pictures[i]; + var prevPicture = this.prevPictures[i]; + + var canvas = document.createElement("canvas"); + canvas.width = iWidth; + canvas.height = iHeight; + canvas.getContext("2d").drawImage(video, 0, 0, iWidth, iHeight); + + // In the future we might be stricter about black pictures... + // if (this.isCanvasMonochrome(canvas)) { + // var err = "canvas is monochrome"; + // this.onMissingStreamHandler(streamName, err); + // return; + // } + + // Check that camera does not keep sending the same + // picture over and over. + if (streamName == "camera" && + prevPicture != null && + this.areCanvasEquals(canvas, prevPicture)) { + var err = "camera is stuck"; + this.onMissingStreamHandler(streamName, err); + return; + } + this.prevPictures[i] = canvas; + + if (grayscale) { + this.canvasToGrayscale(canvas); + } + + this.watermark(canvas, (new Date()).toTZISOString()); + + var handlers = proctor.imageHandlers[i]; + if (handlers != null) { + if (handlers.png != undefined) { + canvas.toBlob(function(blob) { + if (typeof handlers.png.blob == 'function') { + handlers.png.blob(blob); + } + if (typeof handlers.png.base64 == 'function') { + var reader = new FileReader(); + reader.readAsDataURL(blob); + reader.onloadend = function() { + var base64data = reader.result; + handlers.png.base64(base64data); + } + } + }, "image/png"); + } + if (handlers.jpeg != undefined) { + canvas.toBlob(function(blob) { + if (typeof handlers.jpeg.blob == 'function') { + handlers.jpeg.blob(blob); + } + if (typeof handlers.jpeg.base64 == 'function') { + var reader = new FileReader(); + reader.readAsDataURL(blob); + reader.onloadend = function() { + var base64data = reader.result; + handlers.jpeg.base64(base64data); + } + } + }, "image/jpeg"); + } + if (handlers.gif != undefined) { + pictures.push(canvas); + proctor.renderGif(pictures); + } + } + } + } + + takePictures(minMsInterval, maxMsInterval) { + var interval; + if 
(!this.isMissingStreams && + this.numCheckedStreams == this.numStreams && + this.numActiveStreams > 0) { + // User already gave access to all requested streams and + // proctoring has successfully started + if (!this.ready) { + // If this is the first picture we take for this + // session, take note of this and trigger the onReady + // handler + if (typeof this.onReadyHandler == 'function') { + this.onReadyHandler(); + } + this.ready = true; + } + // For every configured stream, take a picture + for (var i = 0; i < this.streams.length; i++) { + if (this.streams[i] != null) { + this.takeShot(this.streams[i], this.mediaConf[this.streamNames[i]].grayscale); + } + } + // Set the time to the next snapshot to a random interval + interval = (Math.random() * (maxMsInterval - minMsInterval)) + minMsInterval; + } else { + // Not all streams are available and we cannot take + // snapshots (yet?). Set interval one second from now. + interval = 1000; + console.log("Waiting for streams: " + this.numCheckedStreams + "/" + this.numStreams + " ready."); + } + if (!this.isMissingStreams) { + // No errors, reschedule this function for the computed + // interval + setTimeout(this.takePictures.bind(this), interval, minMsInterval, maxMsInterval); + } else { + // Something went wrong and proctoring cannot proceed + console.log("Stopping..."); + } + } +} + +// We need this trick to get the folder of this very script and build +// from there the URL to the gif worker. +var scripts = document.querySelectorAll("script"); +var loc = scripts[scripts.length - 1].src; +Proctoring.webWorkerURL = loc.substring(0, loc.lastIndexOf('/')) + "/gif.worker.js";
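The noise gate at the heart of AutoAudioRecorder.someNoise() is just an exponential moving average of the per-sample peak amplitude, with the averaging window derived from the sampling interval (decay = 500 / sampleInterval). A minimal standalone sketch of that rule follows; the function name smoothNoise is illustrative and not part of the package:

```javascript
// Exponential moving average as used by AutoAudioRecorder.someNoise():
// the smoothed value keeps (decay - 1) parts history and 1 part the new
// peak, so for the default 50 ms sampleInterval decay is 10 and sustained
// noise takes roughly half a second to dominate the average.
function smoothNoise(noise, peak, sampleInterval) {
    var decay = 500 / sampleInterval;
    return (noise * (decay - 1) + peak) / decay;
}

// A sustained peak of 1.0 pulls the smoothed value up gradually
// (0.1, 0.19, 0.271, ... approaching 1.0); when the input falls silent
// the value decays back toward 0, which is what lets silence() wait for
// the room to actually be quiet before stopping the recorder.
var noise = 0;
for (var i = 0; i < 5; i++) {
    noise = smoothNoise(noise, 1.0, 50);
}
```

This explains why the recorder tolerates brief pauses in speech: a single silent sample only shaves one tenth off the accumulated noise level, so recording stops only after a sustained quiet stretch.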