
Reddit

Reddit /ˈrɛdɪt/,[6] stylized as reddit,[7] is an entertainment, social networking, and news website where registered community members can submit content, such as text posts or direct links. Registered users can then vote submissions "up" or "down" to organize the posts and determine their position on the site's pages. Content entries are organized by areas of interest called "subreddits". Reddit was founded by University of Virginia roommates Steve Huffman and Alexis Ohanian. Overview: The site is a collection of entries submitted by its registered users, essentially a bulletin board system. Its content is divided into numerous categories, and 50 such categories, or "default subreddits", are visible on the front page to new users and to those who browse the site without logging in to an account. When items (links or text posts) are submitted to a subreddit, users ("redditors") can vote for or against them (upvote/downvote).
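The upvote/downvote mechanic described above feeds a ranking function that decides a post's position on the page. As a rough illustration, here is a Python sketch of one published formulation of Reddit's "hot" ranking; the epoch constant and the 45000-second divisor come from that published version and are assumptions here, not necessarily what the live site runs:

```python
import math

# Sketch of one published formulation of Reddit's "hot" ranking.
# EPOCH_SECONDS (a Unix timestamp from December 2005) and the 45000-second
# divisor are taken from that published version -- assumptions, not
# necessarily the current production values.
EPOCH_SECONDS = 1134028003

def hot(ups: int, downs: int, posted_unix: float) -> float:
    """Higher is better: log-damped vote margin plus a steady time bonus,
    so a newer post beats an older one with the same vote margin."""
    score = ups - downs
    order = math.log10(max(abs(score), 1))
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    seconds = posted_unix - EPOCH_SECONDS
    return sign * order + seconds / 45000
```

In this formulation, every 45000 seconds (12.5 hours) of age is worth one power of ten in vote margin, which is why fresh posts with modest scores can outrank older, higher-scored ones.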

VMware
VMware, Inc. is a US software company that provides cloud and virtualization software and services,[2][3][4] and claims to be the first to have successfully virtualized the x86 architecture commercially.[5] Founded in 1998, VMware is based in Palo Alto, California. In 2004 it was acquired by and became a subsidiary of EMC Corporation; then, on August 14, 2007, EMC sold 15% of the company in a New York Stock Exchange IPO. The company trades under the symbol VMW.[6] History: In 1998, VMware was founded by Diane Greene, Mendel Rosenblum, Scott Devine, Edward Wang and Edouard Bugnion. In 2003, VMware launched VMware VirtualCenter, VMotion, and Virtual SMP technology. 64-bit support appeared in 2004. In August 2007, EMC released 15% of the company's shares in VMware in an initial public offering on the New York Stock Exchange. On September 16, 2008, VMware announced its collaboration with Cisco to provide joint data center solutions.

5 excellent uses of Windows 8 Hyper-V
Buried under all of the clamor and kvetching about Windows 8's most obvious features -- Metro! Metro apps! -- is a new addition that hasn't made many headlines: Windows 8's new Hyper-V-powered virtualization functionality. The exact technical name for Hyper-V in Windows 8 is Client Hyper-V. People may disagree about Windows 8's new surface, pun intended, but there's little arguing that many great things have happened under the hood. An inevitable question is how Client Hyper-V shapes up against stand-alone virtualization platforms such as VMware Workstation and VirtualBox. The biggest reasons to continue using VMware Workstation or VirtualBox would be your existing investment in expertise and familiarity with them. Getting started with Client Hyper-V: What exactly can be done with Client Hyper-V? Note that Client Hyper-V is not installed by default in Windows 8.

Hyper-V
Hyper-V, codenamed Viridian[1] and formerly known as Windows Server Virtualization, is a native hypervisor; it can create virtual machines on x86-64 systems.[2] Starting with Windows 8, Hyper-V supersedes Windows Virtual PC as the hardware virtualization component of the client editions of Windows NT. A server computer running Hyper-V can be configured to expose individual virtual machines to one or more networks. Hyper-V was first released alongside Windows Server 2008 and has been a staple of the Windows Server family ever since. History: A beta version of Hyper-V was shipped with certain x86-64 editions of Windows Server 2008. Microsoft provides Hyper-V through two channels: as part of Windows, where Hyper-V is an optional component of Windows Server 2008 and later, and as the stand-alone Hyper-V Server. Hyper-V Server 2008 was released on October 1, 2008. Hyper-V Server 2008 R2 (an edition of Windows Server 2008 R2) was made available in September 2009 and includes Windows PowerShell v2 for greater CLI control.

Cloud computing
Cloud computing metaphor: for a user, the network elements representing the provider-rendered services are invisible, as if obscured by a cloud. Cloud computing is a computing term or metaphor that evolved in the late 1990s, based on utility models for the consumption of computer resources. Cloud computing involves application systems that are executed within the cloud and operated through internet-enabled devices. Purely cloud-based computing does not rely on cloud storage, as data is removed upon the user's download action. Overview: Cloud computing[3] relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network.[2] At the foundation of cloud computing is the broader concept of converged infrastructure and shared services. Cloud computing, or in simpler shorthand just "the cloud", also focuses on maximizing the effectiveness of the shared resources.

Amazon Web Services
Amazon Web Services (AWS) is a collection of remote computing services, also called web services, that make up a cloud computing platform offered by Amazon.com. These services are based out of 11 geographical regions across the world. The most central and well-known of these services are Amazon EC2 and Amazon S3. These products are marketed as a way to obtain large computing capacity more quickly and more cheaply than building a physical server farm.[2] Architecture: AWS is located in 11 geographical "regions": US East (Northern Virginia), where the majority of AWS servers are based,[3] US West (Northern California), US West (Oregon), Brazil (São Paulo), Europe (Ireland and Germany), Southeast Asia (Singapore), East Asia (Tokyo and Beijing), and Australia (Sydney). Each region has multiple "Availability Zones", which are distinct data centers providing AWS services.
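The region structure above maps directly onto how clients address AWS services. As a hedged illustration, the Python sketch below pairs the regions named in the text with their standard region identifiers and builds hostnames using the common "service.region.amazonaws.com" pattern; a few services use different endpoint forms, so treat this as a sketch rather than an exhaustive endpoint resolver:

```python
# Regions named in the text, keyed by their standard AWS region identifiers.
# (The 11th region, AWS GovCloud, is omitted here since the text does not
# name it explicitly.)
REGIONS = {
    "us-east-1": "US East (Northern Virginia)",
    "us-west-1": "US West (Northern California)",
    "us-west-2": "US West (Oregon)",
    "sa-east-1": "Brazil (São Paulo)",
    "eu-west-1": "Europe (Ireland)",
    "eu-central-1": "Europe (Germany)",
    "ap-southeast-1": "Southeast Asia (Singapore)",
    "ap-northeast-1": "East Asia (Tokyo)",
    "cn-north-1": "East Asia (Beijing)",
    "ap-southeast-2": "Australia (Sydney)",
}

def endpoint(service: str, region: str) -> str:
    """Build a regional endpoint hostname for a service, e.g. EC2.
    Most (not all) regional AWS endpoints follow this pattern."""
    if region not in REGIONS:
        raise ValueError(f"unknown region: {region}")
    return f"{service}.{region}.amazonaws.com"
```

Because each region contains multiple Availability Zones, a client picks a region (and thus an endpoint) first, then places resources into specific zones within it.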

OpenStack
OpenStack is a free and open-source cloud computing software platform.[2] Users primarily deploy it as an infrastructure-as-a-service (IaaS) solution. The technology consists of a series of interrelated projects that control pools of processing, storage, and networking resources throughout a data center, which users manage through a web-based dashboard, command-line tools, or a RESTful API. OpenStack.org released it under the terms of the Apache License. OpenStack began in 2010 as a joint project of Rackspace Hosting and NASA. The OpenStack community collaborates around a six-month, time-based release cycle with frequent development milestones.[13] During the planning phase of each release, the community gathers for the OpenStack Design Summit to facilitate developer working sessions and to assemble plans.[14] History: In 2012, Red Hat announced a preview of their OpenStack distribution,[22] beginning with the "Essex" release. Components include Compute (Nova) and Database (Trove), among others.
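Since OpenStack is driven through RESTful APIs, a client typically first obtains a token from the Identity service (Keystone) and then presents it to the other services (Nova, etc.). A minimal Python sketch of building the JSON body for a Keystone v3 password-authentication request; the username, project, and domain values are placeholders, and the request would be POSTed to the deployment's own /v3/auth/tokens endpoint:

```python
import json

def keystone_auth_body(username: str, password: str, project: str,
                       domain: str = "Default") -> str:
    """Build the JSON body for a Keystone Identity v3 token request.
    Credentials here are placeholders for illustration."""
    body = {
        "auth": {
            "identity": {
                "methods": ["password"],
                "password": {
                    "user": {
                        "name": username,
                        "domain": {"name": domain},
                        "password": password,
                    }
                },
            },
            # Scope the token to a project so it can be used against
            # project-owned resources (e.g. Nova instances).
            "scope": {
                "project": {"name": project, "domain": {"name": domain}}
            },
        }
    }
    return json.dumps(body)
```

The returned token (delivered in the response's X-Subject-Token header) is then passed as the X-Auth-Token header on subsequent calls to the other OpenStack services.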

Enterprise integration
Enterprise integration is a technical field of enterprise architecture focused on the study of topics such as system interconnection, electronic data interchange, product data exchange, and distributed computing environments.[1] It is a concept in enterprise engineering intended to provide the right information at the right place and at the right time, and thereby to enable communication between people, machines and computers and their efficient co-operation and co-ordination.[2] Overview: Requirements and principles deal with determining the business drivers and guiding principles that help in the development of the enterprise architecture. Each functional and non-functional requirement should be traceable to one or more business drivers. Enterprise integration is focused on optimizing operations in a world of continuous and largely unpredictable change.
