Cytoscape.js

This is how easy it is to get started with Cytoscape.js.

About: Cytoscape.js is an open-source graph theory library written in JavaScript. You can use Cytoscape.js for graph analysis and visualisation: it allows you to easily display and manipulate rich, interactive graphs, and it also has graph analysis in mind, containing a slew of useful functions from graph theory. Cytoscape.js is an open-source project, and anyone is free to contribute. The library was developed at the Donnelly Centre at the University of Toronto.

Cytoscape.js & Cytoscape: Though Cytoscape.js shares its name with Cytoscape, Cytoscape.js is not Cytoscape. Cytoscape.js is a JavaScript library: it gives you a reusable graph widget that you can integrate with the rest of your webapp using your own JavaScript code.

Funding: Funding for Cytoscape.js and Cytoscape is provided by NRNB.
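The getting-started code referred to above is not reproduced in this excerpt; a minimal sketch might look like the following (it assumes a page with a <div id="cy"></div> container and the cytoscape bundle loaded; the element ids are made up for illustration):

```javascript
// Graph data: two nodes and one edge connecting them.
const elements = [
  { data: { id: 'a' } },                            // node a
  { data: { id: 'b' } },                            // node b
  { data: { id: 'ab', source: 'a', target: 'b' } }  // edge a -> b
];

// In the browser, this creates the interactive instance:
// const cy = cytoscape({
//   container: document.getElementById('cy'),
//   elements: elements,
//   layout: { name: 'grid' }
// });
```

The `container`/`elements`/`layout` options shown are the basic initialisation options; styling and event handling are layered on from there.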
HTML5 canvas - an introduction to it by Richard Heyes, RGraph author

Introduction: <canvas> is a new HTML tag which is part of the HTML5 standard. It allows bitmap drawing controlled with JavaScript (i.e. you draw on the <canvas> using JavaScript), and it is what the RGraph libraries use to draw their charts. You could liken it to a piece of paper that is part of your page, on to which you can draw. The <canvas> tag uses a "fire and forget" drawing methodology: there is no DOM being maintained, so if you want to alter something you will probably (though not necessarily) have to redraw the entire canvas. Other uses for <canvas> include providing a control panel to your users and creating games.

History of the tag: HTML5 canvas was originally introduced by Apple in 2004 for use in Mac OS X WebKit, to power Dashboard applications and the Safari web browser.

A usage example: a very simple example is drawing a few primitives on the canvas.
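A sketch of such a primitives example, assuming the page contains <canvas id="cvs" width="200" height="150"></canvas> (the id and sizes are made up for illustration):

```javascript
// Draw a few primitives on a 2D canvas context.
function drawPrimitives(ctx) {
  ctx.fillStyle = 'red';
  ctx.fillRect(10, 10, 60, 40);           // a filled rectangle

  ctx.beginPath();
  ctx.arc(120, 60, 25, 0, Math.PI * 2);   // a full circle outline
  ctx.stroke();

  ctx.beginPath();
  ctx.moveTo(10, 130);
  ctx.lineTo(190, 130);                   // a straight line
  ctx.stroke();
}

// In the browser:
// drawPrimitives(document.getElementById('cvs').getContext('2d'));
```

Because canvas is "fire and forget", re-running a function like this after clearing the canvas is also how you would animate or update the drawing.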
arbor.js » introduction

about arbor: Arbor is a graph visualization library built with web workers and jQuery. Rather than trying to be an all-encompassing framework, arbor provides an efficient, force-directed layout algorithm plus abstractions for graph organization and screen-refresh handling. It leaves the actual screen drawing to you. As a result, the code you write with it can focus on the things that make your project unique – the graph data and your visual style – rather than on the physics math that makes the layouts possible.

installation: To use the particle system, get jQuery, add the file at lib/arbor.js to your path somewhere, and include both in your HTML. If you want to let arbor handle realtime color and value tweens for you, include the arbor-tween.js file as well. This will add a pair of new tweening methods to the ParticleSystem object (see the docs to decide whether this appeals to you).

license: Arbor is released under the MIT license.
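A sketch of the basic setup, assuming jQuery and lib/arbor.js are already included via <script> tags (which define the global `arbor`); the node names and layout parameters here are illustrative, not canonical:

```javascript
// <script src="path/to/jquery.min.js"></script>
// <script src="path/to/arbor.js"></script>

function buildSystem(arbor) {
  // repulsion, stiffness, friction – knobs for the force-directed layout
  var sys = arbor.ParticleSystem(1000, 600, 0.5);

  // arbor maintains the graph data and runs the physics...
  sys.addNode('hello', { mass: 2 });
  sys.addEdge('hello', 'world');

  // ...but the actual screen drawing is left to you:
  sys.renderer = {
    init: function (system) { /* set up your canvas here */ },
    redraw: function () { /* draw current node/edge positions here */ }
  };
  return sys;
}

// In the browser: var sys = buildSystem(window.arbor);
```

The `renderer` object is the division of labour the introduction describes: arbor calls `redraw` whenever positions change, and what happens on screen is entirely your code.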
Introduction to the Android NDK

The Android NDK is a set of tools that make it possible to embed native machine code, compiled from C and/or C++, in an application; today we will see how to build an example with the Android NDK.

NDK basics: the Android Virtual Machine (VM) lets application code (written in Java) call methods implemented in native code through JNI. In a nutshell, this means that: the application's source code declares one or more methods with the reserved word native to indicate that the method is implemented in native code, e.g. native byte[] loadFile(String filePath); and you must provide a native shared library containing the implementation of those methods, which is packaged into the application's .apk and loaded with static { System.loadLibrary("FileLoader"); } Note that you must not write the "lib" prefix or the ".so" suffix.

First Android NDK example – Hello World: unimplementedStringFromJni() is a function that is not implemented by the hello-jni library.
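Putting the two pieces above together, the Java side of the binding can be sketched like this (the FileLoader class follows the article's example; the native library itself would have to be built with the NDK as libFileLoader.so):

```java
// Java side of a JNI binding. The implementation of loadFile lives in a
// native shared library built with the NDK and packaged in the .apk.
class FileLoader {

    // Declared with the `native` keyword: implemented in C/C++, not Java.
    public native byte[] loadFile(String filePath);

    static {
        // Pass "FileLoader", not "libFileLoader.so" – the VM adds the
        // platform's "lib" prefix and ".so" suffix itself.
        System.loadLibrary("FileLoader");
    }
}
```

Calling loadFile before the matching native library is packaged and loadable results in an UnsatisfiedLinkError at runtime.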
5. Embedding Python in Another Application

The previous chapters discussed how to extend Python, that is, how to extend the functionality of Python by attaching a library of C functions to it. It is also possible to do it the other way around: enrich your C/C++ application by embedding Python in it. Embedding provides your application with the ability to implement some of its functionality in Python rather than in C or C++.

Embedding Python is similar to extending it, but not quite the same: when you embed Python, you provide the main program yourself. There are several different ways to call the interpreter: you can pass a string containing Python statements to PyRun_SimpleString(), or you can pass a stdio file pointer and a file name (used only for identification in error messages) to PyRun_SimpleFile(). A simple demo of embedding Python can be found in the directory Demo/embed/ of the source distribution.

See also: Python/C API Reference Manual – the details of Python's C interface are given in this manual.
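A minimal sketch of the PyRun_SimpleString() path, close in spirit to the Demo/embed example (the build line is an assumption and requires the Python development headers to be installed):

```c
/* Simplest possible embedding: start the interpreter, run a string of
 * Python statements, and shut it down again.
 *
 * Build, e.g.:  cc embed_demo.c $(python3-config --cflags --ldflags --embed)
 */
#include <Python.h>

int main(int argc, char *argv[])
{
    Py_Initialize();                       /* start the interpreter */

    PyRun_SimpleString("from time import time, ctime\n"
                       "print('Today is', ctime(time()))\n");

    Py_Finalize();                         /* shut it down again    */
    return 0;
}
```

Since the C program owns main(), it decides when the interpreter starts, what it runs, and when it goes away – exactly the inversion of control the section describes.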
vpxEncodingGuide – FFmpeg

libvpx is the VP8 video encoder for WebM, an open, royalty-free media file format. This guide is an attempt to summarize the most important options for creating video with libvpx. To install FFmpeg with support for libvpx, look at the Compilation Guides and compile FFmpeg with the --enable-libvpx option. Note that in the examples below, the libvorbis audio encoder is used. Make sure your FFmpeg version also includes libvorbis (check with ffmpeg -codecs), as FFmpeg's native Vorbis encoder does not provide comparable quality.

Variable Bitrate: libvpx offers a variable bitrate (VBR) mode by default:

ffmpeg -i input.mp4 -c:v libvpx -b:v 1M -c:a libvorbis output.webm

Choose a higher bit rate if you want better quality. In addition to the "default" VBR mode, there is a constant quality mode (as in the x264 encoder) that will ensure every frame gets the number of bits it deserves to achieve a certain quality level, rather than forcing the stream to have an average bit rate.
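The constant quality mode described above is selected with the -crf option (lower values mean better quality); a typical invocation might look like this, with the exact -crf value being a matter of taste:

```shell
# Constant quality mode: -crf picks the quality level, and -b:v acts as
# an upper bound on the bitrate rather than a target average.
ffmpeg -i input.mp4 -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis output.webm
```

As in the VBR example, libvorbis handles the audio; only the rate-control options for the video stream change.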
stream-m - an HTML5-compatible WebM live streaming server

stream.m was created as an open-source solution for streaming live video right into the web browser, using the HTML5 video tag and Google's WebM video format. The current version is a working prototype which showcases the main ideas. The main design goal is low resource usage. It has a web interface with a realtime bandwidth monitor (with a resolution of 1/10 of a second) for spotting network congestion, and it also supports simultaneous streams (channels). Note: the recommended ffmpeg options have changed (again).

The live stream consists of fragments (self-contained units of frames that do not reference any frame outside the fragment). The ideal fragment size is around 200 kBytes (about 1600 kbits). For example, if you are publishing a 500 kbit/s stream at 16 fps, then 1600 / 500 * 16 = 51.2, so every 52nd video frame should be a keyframe. The server splits fragments when it deems necessary. All operations are done over HTTP. The server is started with:

java StreamingServer <configfile>
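The keyframe arithmetic above can be checked (and reused for other stream parameters) with a few lines of shell; the 500 kbit/s and 16 fps figures are the example's, not requirements:

```shell
# frames per fragment = fragment_kbit / bitrate_kbit * fps
bitrate_kbit=500
fps=16
fragment_kbit=1600

keyint=$(( fragment_kbit * fps / bitrate_kbit ))   # integer part of 51.2
echo "force a keyframe roughly every $((keyint + 1)) frames"
```

When encoding the published stream with ffmpeg, the keyframe interval computed this way is what you would pass to the encoder's GOP-size option (-g).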
AWS Offers NVIDIA GRID - Performance for Cloud Game Hosting - Hosting Journalist

Amazon Web Services (AWS) is now offering NVIDIA GRID technology through its newly announced Amazon Elastic Compute Cloud (Amazon EC2) G2 instance, delivering GPU acceleration to users running graphics-intensive applications and hosting games in the cloud. This expands the uses of cloud computing from storage, data processing and 2D applications to 3D, fully GPU-accelerated, interactive consumer and professional applications. With NVIDIA GRID GPUs, Software-as-a-Service (SaaS) companies can now build cloud-based offerings with extreme graphics performance for design, visualization, media creation, games and more.

"AWS sees a growing benefit for adding GPUs to our cloud," said Matt Wood, general manager of Data Science at AWS. "The NVIDIA GRID GPUs in our new G2 instances enable graphical applications to be rendered in the AWS cloud and streamed to a world with increasing internet bandwidth and proliferation of device types."