It was the native file format of the old Autodesk 3D Studio DOS (releases 1 to 4), which was popular until its successor (3D Studio MAX 1.0) replaced it in April 1996. Having been around since 1990 (when the first version of 3D Studio DOS was launched), it has grown to become a de facto industry standard for transferring models between 3D programs, or for storing models for 3D resource catalogs (along with OBJ, which is more frequently used as a model archiving file format).[1]

While the 3DS format aims to provide an import/export format, retaining only essential geometry, texture and lighting data, the related MAX format (now superseded by the PRJ format[citation needed]) also contains extra information specific to Autodesk 3ds Max, to allow a scene to be completely saved/loaded.


The format is chunk-based: each section of data is embedded in a block that contains a chunk identifier, the length of the data (which gives the location of the next main block), and the data itself. This allows parsers to skip chunks they do not recognize, and allows the format to be extended.

The chunks form a hierarchical structure, similar to an XML DOM tree. The first two bytes of a chunk are its ID; from that value the parser can identify the chunk and decide whether to parse it or skip it. The next four bytes contain a little-endian integer that is the length of the chunk, including its data, the length of its sub-blocks and the 6-byte header. The chunk's data comes next, followed by its sub-chunks, in a structure that may extend several levels deep.
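As a sketch of the layout just described, the following Python fragment walks a chunk stream by reading the 2-byte ID and 4-byte little-endian length, then skipping ahead by the length. The chunk IDs 0xAAAA and 0xBBBB are invented for the example and are not real 3DS IDs.

```python
import struct

def walk_chunks(data, start=0, end=None, depth=0):
    """Walk a 3DS-style chunk stream: 2-byte ID, then a 4-byte
    little-endian length that includes the 6-byte header itself."""
    end = len(data) if end is None else end
    pos = start
    chunks = []
    while pos + 6 <= end:
        chunk_id, length = struct.unpack_from('<HI', data, pos)
        if length < 6:
            break  # malformed chunk; a length below 6 cannot be valid
        chunks.append((depth, chunk_id, length))
        # A parser that does not recognize chunk_id can simply skip
        # `length` bytes to reach the next sibling chunk.
        pos += length
    return chunks

# Hypothetical two-chunk buffer: an outer chunk 0xAAAA containing
# an inner chunk 0xBBBB with 2 bytes of payload.
inner = struct.pack('<HI', 0xBBBB, 6 + 2) + b'\x01\x02'
outer = struct.pack('<HI', 0xAAAA, 6 + len(inner)) + inner
print(walk_chunks(outer))               # one top-level chunk
print(walk_chunks(outer, 6, depth=1))   # recurse into its sub-chunks
```

A real parser would recurse into the data region of chunks whose IDs it knows contain sub-chunks, since the format does not mark container chunks explicitly.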

Below is a list of the most common IDs for chunks, represented in a hierarchical fashion depicting their dependencies:[2][3]

Accurate vertex normals cannot be stored in the .3ds file. Instead, "smoothing groups"[note 1] are used so that the receiving program can recreate a (hopefully good) representation of the vertex normals. This is a hold-over from many animation programs that started in the 1980s (3DS MAX, Lightwave and trueSpace still use smoothing groups, and Maya did up to v2.51).

^ Smoothing groups (see the discussion of surfaces in Elements of Mesh Modeling for the rationale behind their use) are stored as a bit field, with 4 bytes (a long int) for each face, thus allowing up to 32 (4×8) smoothing groups per face.
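The bit-field convention in this note can be sketched in Python; the group numbers and masks below are purely illustrative.

```python
def smoothing_groups(mask):
    """Return the 1-based smoothing-group numbers set in a 32-bit
    per-face bit field (bit 0 -> group 1, ..., bit 31 -> group 32)."""
    return [g + 1 for g in range(32) if mask & (1 << g)]

# A face in groups 1 and 3 has bits 0 and 2 set: 0b101 == 5.
print(smoothing_groups(0b101))   # [1, 3]

def share_group(a, b):
    """Two adjacent faces are smoothed across their shared edge when
    their bit fields have at least one group in common."""
    return (a & b) != 0

print(share_group(0b101, 0b100))  # True: both are in group 3
print(share_group(0b001, 0b010))  # False: no common group
```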

1.
Nintendo 3DS
–
The Nintendo 3DS is a portable game console produced by Nintendo. It is capable of displaying stereoscopic 3D effects without the use of 3D glasses or additional accessories. Nintendo announced the device in March 2010 and officially unveiled it at E3 2010 on June 15, 2010. The console succeeds the Nintendo DS and is compatible with older Nintendo DS games. Its primary competitor is the PlayStation Vita from Sony. The Nintendo 3DS was first released in Japan on February 26, 2011, and worldwide beginning in March 2011. Less than six months later, on July 28, 2011, Nintendo announced a significant price reduction from US$249 to US$169 amid disappointing launch sales. The company offered ten free Nintendo Entertainment System games and ten free Game Boy Advance games from the Nintendo eShop to consumers who bought the system at the launch price. This strategy was considered a success, and the console went on to become one of Nintendo's most successful handheld consoles in the first two years of its release. As of September 30, 2016, the Nintendo 3DS family of systems had sold a combined 61.57 million units. Several redesigns have been made since, including the larger Nintendo 3DS XL; the Nintendo 2DS, an entry-level version of the console with a fixed slate form factor and lacking autostereoscopic functionality, was released in Western markets in October 2013. Nintendo began experimenting with stereoscopic 3D video game technology in the 1980s. The Famicom 3D System, an accessory consisting of liquid crystal shutter glasses, was Nintendo's first product that enabled stereoscopic 3D effects. Although very few titles were released, Nintendo helped design one, Famicom Grand Prix II: 3D Hot Rally, which was co-developed by Nintendo and HAL Laboratory. The Famicom 3D System failed to garner market interest and was never released outside Japan.
Despite the limited success, Nintendo pressed ahead with 3D development into the 1990s. Gunpei Yokoi, creator of the Game Boy handheld device and the popular Metroid video game, developed a new 3D device for Nintendo called the Virtual Boy. It was a portable system consisting of goggles and a controller that used a spinning disc to achieve full stereoscopic monochrome 3D. Released in 1995, the Virtual Boy sold fewer than a million units and spawned only 22 compatible game titles; its failure left many at Nintendo doubting the viability of 3D gaming. Despite this, Nintendo continued to investigate the incorporation of 3D technology into other products. The GameCube, released in 2001, is another 3D-capable system: with an LCD attachment, it could display true stereoscopic 3D, but due to the expensive nature of the requisite peripheral technology at the time, the GameCube's 3D functionality was never marketed to the public. Nintendo later experimented with a 3D LCD during development of the Game Boy Advance SP. Another attempt was made in preparation for a virtual navigation guide to be used on the Nintendo DS at Shigureden, an interactive museum in Japan. Nintendo president Hiroshi Yamauchi encouraged additional 3D research in an effort to use the technology in the exhibition. Although the project fell short, Nintendo was able to collect valuable research on liquid crystal which would later aid in the development of the Nintendo 3DS.

2.
Hexadecimal
–
In mathematics and computing, hexadecimal is a positional numeral system with a radix, or base, of 16. It uses sixteen distinct symbols, most often the symbols 0–9 to represent values zero to nine and A–F (or a–f) to represent values ten to fifteen. Hexadecimal numerals are widely used by computer system designers and programmers. As each hexadecimal digit represents four binary digits, it allows a more human-friendly representation of binary-coded values; one hexadecimal digit represents a nibble, which is half of an octet or byte. For example, a byte can have values ranging from 00000000 to 11111111 in binary form. In a non-programming context, a subscript is typically used to give the radix. Several notations are used to support hexadecimal representation of constants in programming languages, usually involving a prefix or suffix. The prefix 0x is used in C and related languages, where this value might be denoted as 0x2AF3. In contexts where the base is not clear, hexadecimal numbers can be ambiguous and confused with numbers expressed in other bases. There are several conventions for expressing values unambiguously: a numerical subscript can give the base explicitly (159₁₀ is decimal 159; 159₁₆ is hexadecimal 159, which is equal to 345₁₀), and some authors prefer a text subscript, such as 159decimal and 159hex, or 159d and 159h. In URIs, character codes are written as hexadecimal pairs prefixed with %, as in example.com/name%20with%20spaces, where %20 is the space character. Similarly, in HTML and XML, &#x2019; represents the right single quotation mark, Unicode code point 2019 in hex (8217 in decimal). In the Unicode standard, a code point is represented with U+ followed by the hex value. Color references in HTML, CSS and X Window can be expressed with six hexadecimal digits prefixed with #; CSS also allows 3-hexdigit abbreviations with one hexdigit per component, so #FA3 abbreviates #FFAA33. In *nix shells, AT&T assembly language and likewise the C programming language, to output an integer as hexadecimal with the printf function family, the format conversion code %X or %x is used.
In Intel-derived assembly languages and Modula-2, hexadecimal is denoted with a suffixed H or h; some assembly languages use the notation HABCD. Ada and VHDL enclose hexadecimal numerals in based numeric quotes: 16#5A3#; for bit vector constants VHDL uses the notation x"5A3". Verilog represents hexadecimal constants in the form 8'hFF, where 8 is the number of bits in the value. The Smalltalk language uses the prefix 16r: 16r5A3. PostScript and the Bourne shell and its derivatives denote hex with the prefix 16#: 16#5A3. For PostScript, binary data can also be expressed as unprefixed consecutive hexadecimal pairs; in early systems, when a Macintosh crashed, one or two lines of hexadecimal code would be displayed under the Sad Mac to tell the user what went wrong. Common Lisp uses the prefixes #x and #16r; setting the variables *read-base* and *print-base* to 16 can also be used to switch the reader and printer of a Common Lisp system to hexadecimal number representation, so that hexadecimal numbers can be represented without the #x or #16r prefix code. MSX BASIC, QuickBASIC, FreeBASIC and Visual Basic prefix hexadecimal numbers with &H (as in &H5A3), while BBC BASIC and Locomotive BASIC use & for hex. The TI-89 and 92 series use a 0h prefix: 0h5A3. ALGOL 68 uses the prefix 16r to denote hexadecimal numbers; binary, quaternary and octal numbers can be specified similarly.
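The 0x prefix and the printf-style %x/%X conversions described above can be demonstrated in a few lines of Python, which shares these conventions with C:

```python
n = 0x2AF3                     # C-style 0x prefix, as in the text
print(n)                       # the same value in decimal: 10995
print('%X' % n, '%x' % n)      # printf-style conversions: 2AF3 2af3
print(format(n, '#x'))         # 0x2af3, with the prefix included
print(int('5A3', 16))          # parsing hex 5A3 gives 1443
```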

3.
ASCII
–
ASCII, abbreviated from American Standard Code for Information Interchange, is a character encoding standard. ASCII codes represent text in computers, telecommunications equipment, and other devices. Most modern character-encoding schemes are based on ASCII, although they support many additional characters. ASCII was developed from telegraph code, and its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on the ASCII standard began on October 6, 1960; the first edition of the standard was published in 1963, underwent a major revision during 1967, and experienced its most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting of lists, and added features for devices other than teleprinters. Originally based on the English alphabet, ASCII encodes 128 specified characters into seven-bit integers. The characters encoded are the numbers 0 to 9, lowercase letters a to z, uppercase letters A to Z, basic punctuation symbols, and control codes that originated with Teletype machines; for example, lowercase j would become binary 1101010 and decimal 106. ASCII includes definitions for 128 characters: 33 are non-printing control characters that affect how text and space are processed, and 95 are printable characters. The IANA encourages use of the name US-ASCII for Internet uses of ASCII. The ASA became the United States of America Standards Institute and ultimately the American National Standards Institute. There was some debate at the time whether there should be more control characters rather than the lowercase alphabet. The X3.2.4 task group voted its approval for the change to ASCII at its May 1963 meeting; the X3 committee made other changes, including other new characters, renaming some control characters and moving or removing others. ASCII was subsequently updated as USAS X3.4-1967, then USAS X3.4-1968 and ANSI X3.4-1977. The committee also proposed a 9-track standard for magnetic tape, and attempted to deal with some punched card formats. The X3.2 subcommittee designed ASCII based on the earlier teleprinter encoding systems. Like other character encodings, ASCII specifies a correspondence between digital bit patterns and character symbols. This allows digital devices to communicate with each other and to process and store data. Before ASCII was developed, the encodings in use included 26 alphabetic characters and 10 numerical digits; codes such as ITA2 were in turn based on the 5-bit telegraph code Émile Baudot invented in 1870 and patented in 1874. The committee debated the possibility of a shift function, which would allow more than 64 codes to be represented by a six-bit code. In a shifted code, some character codes determine choices between options for the following character codes. It allows compact encoding, but is less reliable for data transmission. The standards committee decided against shifting, and so ASCII required at least a seven-bit code. The committee also considered an eight-bit code, since eight bits would allow two four-bit patterns to efficiently encode two digits with binary-coded decimal.
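The encoding of lowercase j mentioned above, and the 95/33 split between printable and control characters, can be checked directly in Python:

```python
# Lowercase 'j' encodes to decimal 106, binary 1101010.
print(ord('j'))                 # 106
print(format(ord('j'), 'b'))    # 1101010

# Of the 128 ASCII code points, the printable ones are 0x20..0x7E;
# the remaining 33 (0x00..0x1F plus DEL, 0x7F) are control characters.
printable = [c for c in range(128) if 0x20 <= c <= 0x7E]
print(len(printable), 128 - len(printable))   # 95 33
```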

4.
Autodesk
–
Autodesk, Inc. is an American multinational software corporation that makes software for the architecture, engineering, construction, manufacturing, media, and entertainment industries. Autodesk is headquartered in San Rafael, California, and features a gallery of its customers' work in its San Francisco building. The company was founded in 1982 by John Walker, a coauthor of the first versions of AutoCAD, the company's flagship computer-aided design software. Its AutoCAD and Revit software is used by architects, engineers, and structural designers to design, draft and model buildings and other structures. Autodesk software has been used in many fields, from the New York Freedom Tower to Tesla electric cars. The company's Revit line of software for Building Information Modeling is designed to let users explore the planning and construction of a building. Autodesk's Media and Entertainment division creates software for visual effects, color grading, and editing as well as animation, game development, and design visualization; 3ds Max and Maya are both 3D animation software packages used in visual effects and game development. Its platform offerings include Autodesk Suites, Subscription and Web Services, which include Autodesk Cloud and Autodesk Labs. In what was seen as an unusual step for a maker of high-end business software, Autodesk began offering AutoCAD LT 2012 for Mac through the Apple Mac App Store. The products from its consumer group include 123D, Fluid FX, Homestyler and Pixlr; users range from children, students and artists to makers and DIYers. The Architecture, Engineering and Construction industry group is headquartered in Waltham, Massachusetts, in a LEED Platinum building. Autodesk's architecture, engineering, and construction solutions include AutoCAD-based design and documentation software such as AutoCAD Architecture, AutoCAD MEP, and AutoCAD Civil 3D. Its flagship product for relational Building Information Modeling is Revit, available as Revit Architecture, Revit Structure, Revit MEP or an all-in-one product.
Projects that have used software from the Autodesk AEC division include the NASA Ames building, the San Francisco Bay Bridge and the Shanghai Tower. Autodesk's manufacturing industry group is headquartered in Lake Oswego, Oregon. Autodesk's Media and Entertainment Division is based in Montreal, Quebec; it was established in 1999 after Autodesk, Inc. acquired Discreet Logic, Inc. and merged its operations with Kinetix. In January 2006, Autodesk acquired Alias, a developer of 3D graphics technology, and in October 2008, Autodesk acquired the Softimage brand from Avid. The principal product offerings from the Media and Entertainment Division are the Autodesk Entertainment Creation Suites, which include Maya, Softimage, 3ds Max, Mudbox, Smoke and Flame. Much of Avatar's visual effects were created with Autodesk media and entertainment software; Autodesk software enabled Avatar director James Cameron to aim a camera at actors wearing motion-capture suits in a studio. In 2011, Autodesk acquired a cloud-based set of image tools and utilities called Pixlr. Autodesk has developed and purchased many special-purpose renderers, but many Autodesk products are bundled with third-party renderers such as NVIDIA Mental Ray or Iray. Autodesk Raytracer is a simple path-tracing renderer based on Opticore technology, and Maya Vector is a vector renderer based on Electric Rain's RAViX technology. The problem with this part of Autodesk's history is that it was a time of discovery in computer graphics; in this sense Lightscape was more than just another product: it was an essential part of the development of rendering technology generally, and part of its evolution.

5.
Autodesk 3ds Max
–
Autodesk 3ds Max, formerly 3D Studio and then 3D Studio Max, is a professional 3D computer graphics program for making 3D animations, models, games and images. It is developed and produced by Autodesk Media and Entertainment; it has modeling capabilities and a flexible plugin architecture and can be used on the Microsoft Windows platform. It is frequently used by game developers and many TV commercial studios, and is also used for movie effects and movie pre-visualization. The original 3D Studio product was created for the DOS platform by Gary Yost and the Yost Group; its release made Autodesk's previous 3D rendering package, AutoShade, obsolete. After 3D Studio DOS Release 4, the product was rewritten for the Windows NT platform; this version was also originally created by the Yost Group. It was released by Kinetix, which was at that time Autodesk's division of media and entertainment. Autodesk purchased the product at the second release update of the 3D Studio MAX version and internalized development entirely over the next two releases. Later, the name was changed to 3ds max to better comply with the naming conventions of Discreet. When it was re-released, the product was branded with the Autodesk logo. MAXScript is a scripting language that can be used to automate repetitive tasks, combine existing functionality in new ways, and develop new tools and user interfaces; plugin modules can be created entirely within MAXScript. Character Studio was a plugin which, since version 4 of Max, is integrated into 3D Studio Max; it helps users to animate virtual characters. The system works using a rig or Biped skeleton which has stock settings that can be modified and customized to fit the character meshes.
This tool also includes robust editing tools for IK/FK switching, pose manipulation, layers and keyframing workflows; Biped objects have other useful features that help accelerate the production of walk cycles and movement paths, as well as secondary motion. Scene Explorer, a tool that provides a hierarchical view of scene data and analysis, has the ability to sort, filter, and search a scene by any object type or property. Added in 3ds Max 2008, it was the first component to facilitate .NET managed code in 3ds Max outside of MAXScript. 3ds Max supports both import and linking of DWG files; improved memory management in 3ds Max 2008 enables larger scenes to be imported with multiple objects. The texture workflow includes the ability to combine an unlimited number of textures, a material/map browser with support for drag-and-drop assignment, and hierarchies with thumbnails. Two keying modes, set key and auto key, offer support for different keyframing workflows; fast and intuitive controls for keyframing, including cut, copy, and paste, let the user create animations with ease.

6.
Binary file
–
A binary file is a computer file that is not a text file; the term is often used to mean any non-text file. Binary files are usually thought of as a sequence of bytes, and they typically contain bytes that are intended to be interpreted as something other than text characters. Compiled computer programs are typical examples; indeed, compiled applications are sometimes referred to, particularly by programmers, as binaries. But binary files can also contain images, sounds, compressed versions of other files, etc.: in short, any type of file content whatsoever. Some binary files contain headers, blocks of metadata used by a program to interpret the data in the file. The header often contains a signature or magic number which can identify the format. For example, a GIF file can contain multiple images, and headers are used to identify and describe each block of image data; the leading bytes of the header contain text like GIF87a or GIF89a that identifies the binary as a GIF file. If a binary file does not contain any headers, it may be called a flat binary file. To send binary files through certain systems that do not allow all data values, they are often translated into a plain text representation; encoding the data has the disadvantage of increasing the file size during the transfer, as well as requiring translation back into binary after receipt. See Binary-to-text encoding for more on this subject. A hex editor or viewer may be used to view file data as a sequence of hexadecimal values for the corresponding bytes of a binary file. If a binary file is opened in a text editor, each group of eight bits will typically be translated as a single character. Other types of viewer simply replace the unprintable characters with spaces, revealing only the human-readable text. This type of view is useful for inspection of a binary file, for example to find passwords in games or hidden text in non-text files.
It can even be used to inspect suspicious files for unwanted effects; for example, the user would see any URL/email address to which the suspected software may attempt to connect in order to upload unapproved data. If the file is itself treated as an executable and run, the operating system will attempt to interpret it as a sequence of machine instructions. Standards are very important to binary files. For example, a binary file interpreted by the ASCII character set will result in text being displayed, while a custom application can interpret the file differently: a byte may be a sound, or a pixel, or even an entire word. Binary itself is meaningless until an executed algorithm defines what should be done with each bit or byte; thus, just examining the binary and attempting to match it against known formats can lead to the wrong conclusion as to what it actually represents. This fact can be used in steganography, where an algorithm interprets a binary data file differently to reveal hidden content; without the algorithm, it is impossible to tell that hidden content exists.
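The magic-number check described above for GIF files can be sketched in a few lines of Python; the header bytes here are fabricated minimal examples, not complete files.

```python
def sniff_gif(data):
    """Check a byte string's leading 'magic number' for one of the
    two GIF signatures mentioned in the text."""
    return data[:6] in (b'GIF87a', b'GIF89a')

header = b'GIF89a' + b'\x00' * 10     # fabricated minimal header
print(sniff_gif(header))              # True
print(sniff_gif(b'\x89PNG\r\n'))      # False: a different signature
```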

7.
Document Object Model
–
The objects of a document can be manipulated programmatically, and any visible changes occurring as a result may then be reflected in the display of the document. Principal standardization of DOM was handled by the W3C, which last developed a recommendation in 2004. WHATWG later took over development of the standard, publishing it as a living document; the W3C now publishes stable snapshots of the WHATWG standard. JavaScript was released by Netscape Communications in 1995 within Netscape Navigator 2.0; Netscape's competitor, Microsoft, released Internet Explorer 3.0 the following year with a port of JavaScript called JScript. JavaScript and JScript let web developers create web pages with client-side interactivity. The limited facilities for detecting user-generated events and modifying the HTML document in the first generation of these languages eventually became known as DOM Level 0 or Legacy DOM. No independent standard was developed for DOM Level 0, but it was described in the specification of HTML 4. Legacy DOM was limited in the kinds of elements that could be accessed: form, link and image elements could be referenced with a hierarchical name that began with the root document object. A hierarchical name could make use of either the names or the sequential index of the traversed elements; for example, a form input element could be accessed as either document.formName.inputName or document.forms[0].elements[0]. The Legacy DOM enabled client-side form validation and the popular rollover effect. DHTML required extensions to the rudimentary document object that was available in the Legacy DOM implementations; these versions of the DOM became known as the Intermediate DOM. After the standardization of ECMAScript, the W3C DOM Working Group began drafting a standard DOM specification.
The completed specification, known as DOM Level 1, was recommended by W3C in late 1998. By 2005, large parts of W3C DOM were well-supported by common ECMAScript-enabled browsers, including Microsoft Internet Explorer version 6, Opera, Safari and Gecko-based browsers. The W3C DOM Working Group published its final recommendation and subsequently disbanded in 2004; development efforts migrated to the WHATWG, which continues to maintain a living standard. In 2009, the Web Applications group reorganized DOM activities at the W3C. In 2013, due to a lack of progress and the release of HTML5. Meanwhile, in 2015 the Web Applications group was disbanded and DOM stewardship passed to the Web Platform group; beginning with the publication of DOM Level 4 in 2015, the W3C creates new recommendations based on snapshots of the WHATWG standard. DOM Level 1 provided a complete model for an entire HTML or XML document. DOM Level 2 was published in late 2000; it introduced the getElementById function as well as an event model and support for XML namespaces and CSS.

8.
Endianness
–
Endianness refers to the sequential order used to numerically interpret a range of bytes in computer memory as a larger, composed word value. It also describes the order of byte transmission over a digital link. In big-endian format, the most significant byte is stored at the first location; little-endian format reverses the order of the sequence and stores the least significant byte first, with the most significant byte stored last. The order of bits within a byte can also have endianness. Both big and little forms of endianness are widely used in digital electronics: as examples, IBM z/Architecture mainframes use big-endian while Intel x86 processors use little-endian; the designers chose these byte orders in the 1960s and 1970s respectively. Big-endian is the most common format in data networking; fields in the protocols of the Internet protocol suite, such as IPv4, IPv6 and TCP, are transmitted in big-endian order, and for this reason big-endian byte order is also referred to as network byte order. Little-endian storage is popular for microprocessors, in part due to the significant influence of Intel Corporation on microprocessor designs. Mixed forms also exist; for instance, the ordering of bytes in a 16-bit word may differ from the ordering of 16-bit words within a 32-bit word. Such cases are sometimes referred to as mixed-endian or middle-endian, and there are also some bi-endian processors that operate in either little-endian or big-endian mode. Big-endianness may be demonstrated by writing a decimal number, say one hundred twenty-three, on paper in the usual positional notation understood by a numerate reader: 123. The digits are written starting from the left and proceeding to the right, with the most significant digit, 1, written first; this is analogous to the lowest address of memory being used first. This is an example of a big-endian convention taken from daily life. The little-endian way of writing the same number, one hundred twenty-three, would place the hundreds digit 1 in the right-most position: 321.
A person following conventional big-endian place-value order, who is not aware of this special ordering, would read it as a different number. Endianness in computing is similar, but it applies to the ordering of bytes rather than digits. Danny Cohen introduced the terms Little-Endian and Big-Endian for byte ordering in an article from 1980. Computer memory consists of a sequence of storage cells, and each cell is identified in hardware and software by its memory address; if the total number of cells in memory is n, addresses are enumerated from 0 to n−1. Computer programs often use data structures or fields that consist of more data than can be stored in one memory cell. For the purpose of this article, such a field must be one whose use as an operand of an instruction is relevant; in addition, it has to be of numeric type in some positional number system.
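The byte orderings described above can be demonstrated with Python's struct module, which exposes explicit big-endian (">"), little-endian ("<") and network ("!") byte-order prefixes:

```python
import struct

value = 0x0A0B0C0D
print(struct.pack('>I', value).hex())   # big-endian:    0a0b0c0d
print(struct.pack('<I', value).hex())   # little-endian: 0d0c0b0a

# Network byte order is defined as big-endian:
print(struct.pack('!I', value) == struct.pack('>I', value))   # True
```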

9.
Triangle mesh
–
A triangle mesh is a type of polygon mesh in computer graphics. It comprises a set of triangles that are connected by their common edges or corners. Many graphics software packages and hardware devices can operate more efficiently on triangles that are grouped into meshes than on a similar number of triangles presented individually. This is typically because computer graphics do operations on the vertices at the corners of triangles; with individual triangles, the system has to operate on three vertices for every triangle. In many computer graphics applications it is necessary to manage a mesh of triangles; the mesh components are vertices, edges, and triangles. An application might require knowledge of the connections between the mesh components. These connections can be managed independently of the actual vertex positions, and this section describes a simple data structure that is convenient for managing them. This is not the only possible data structure; many other types exist and support various queries about meshes. Various methods of storing and working with a mesh in computer memory are possible. With the OpenGL and DirectX APIs there are two primary ways of passing a triangle mesh to the graphics hardware: triangle strips and index arrays. One way of sharing vertex data between triangles is the triangle strip, in which each triangle shares one complete edge with one neighbour; another is the triangle fan, a set of connected triangles sharing one central vertex. With these methods vertices are dealt with efficiently, resulting in the need to process only N+2 vertices in order to draw N triangles. Triangle strips are efficient; however, the drawback is that it may not be obvious how, or convenient, to translate an arbitrary triangle mesh into strips. The data structure representing the mesh provides support for two basic operations: inserting triangles and removing triangles.
It also supports an edge collapse operation that is useful in triangle decimation schemes. A mesh vertex is defined by a single integer and is denoted ⟨v⟩. A mesh edge is defined by a pair of integers ⟨v0, v1⟩; to support edge maps, the edges are stored so that v0 = min(v0, v1). A triangle component is defined by a triple of integers ⟨v0, v1, v2⟩; to support triangle maps, the triangles are stored so that v0 = min(v0, v1, v2). Observe that ⟨v0, v1, v2⟩ and ⟨v0, v2, v1⟩ are treated as distinct; an application requiring double-sided triangles must insert both triples into the data structure. For the sake of avoiding constant reminders about the ordering of indices, connectivity between the components is completely determined by the set of triples representing the triangles. A triangle t = ⟨v0, v1, v2⟩ has vertices v0, v1 and v2; it has edges e0 = ⟨v0, v1⟩, e1 = ⟨v1, v2⟩, and e2 = ⟨v2, v0⟩.
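The N+2 property of triangle strips can be sketched as follows; the winding-flip on every other triangle is the usual strip convention (as in OpenGL), a detail the text above does not spell out.

```python
def strip_to_triangles(strip):
    """Expand a triangle strip of N+2 vertex indices into N triangles.
    Alternate triangles swap two indices so that all triangles keep a
    consistent winding order."""
    tris = []
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

strip = [0, 1, 2, 3, 4]            # 5 vertices yield 3 triangles
print(strip_to_triangles(strip))   # [(0, 1, 2), (2, 1, 3), (2, 3, 4)]
```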

10.
Vertex normal
–
Commonly, a vertex normal is computed as the normalized average of the surface normals of the faces that contain that vertex. The average can be weighted, for example by the area of the face, or it can be unweighted. Vertex normals can also be computed for polygonal approximations to smooth surfaces such as NURBS, or specified explicitly for artistic purposes. Vertex normals are used in Gouraud shading, Phong shading and other lighting models. Using vertex normals, much smoother shading than flat shading can be achieved; however, without some modifications, it cannot produce a sharp edge.
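A minimal sketch of the unweighted variant described above, using a cross product for the face normals and then averaging and normalizing:

```python
import math

def face_normal(p0, p1, p2):
    """Cross product of two edge vectors of a triangle (unnormalized)."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)

def vertex_normal(face_normals):
    """Unweighted average of the adjoining face normals, normalized."""
    s = [sum(n[i] for n in face_normals) for i in range(3)]
    length = math.sqrt(sum(c * c for c in s))
    return tuple(c / length for c in s)

# A counter-clockwise triangle in the xy-plane faces +z:
print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))   # (0, 0, 1)

# A vertex shared by a +z-facing face and a +x-facing face gets a
# normal pointing halfway between the two:
n = vertex_normal([(0, 0, 1), (1, 0, 0)])
print(n)
```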

11.
Autodesk Maya
–
Maya is used to create interactive 3D applications, including video games, animated films, TV series, and visual effects. It was originally a next-generation animation product based on code from The Advanced Visualizer by Wavefront Technologies and PowerAnimator by Alias Research, Inc. The code was ported to IRIX and animation features were added; the porting project's codename was Maya. Walt Disney Feature Animation collaborated closely with Maya's development during its production of Dinosaur; Disney requested that the user interface of the application be customizable so that a personalized workflow could be created. This was an influence on the open architecture of Maya. After Silicon Graphics Inc. acquired both Alias and Wavefront Technologies, Inc., Wavefront's next-generation technology was merged into Maya. SGI's acquisition was a response to Microsoft Corporation acquiring Softimage, Co.; the new wholly owned subsidiary was named Alias|Wavefront. In the early days of development, Maya started with Tcl as the scripting language, but after the merger with Wavefront, Sophia, the scripting language in Wavefront's Dynamation, was chosen as the basis of MEL. Maya 1.0 was released in February 1998. Following a series of acquisitions, Maya was bought by Autodesk in 2005; under the name of the new parent company, it was renamed Autodesk Maya, although the name Maya continues to be the dominant name used for the product. In 2005, while working for Alias|Wavefront, Jos Stam shared an Academy Award for Technical Achievement with Edwin Catmull and Tony DeRose for their invention and application of subdivision surfaces. On February 8, 2008, Duncan Brinsmead, Jos Stam, Julia Pakalns and Martin Werner received an Academy Award for Technical Achievement for the design and implementation of the Maya Fluid Effects system. The software was initially released for the IRIX operating system.
However, this support was discontinued in August 2006 after the release of version 6.5. Maya was available in both Complete and Unlimited editions until August 2008, when it was turned into a single suite. Users define a virtual workspace to implement and edit media of a particular project; scenes can be saved in a variety of formats, the default being .mb. Maya exposes a node graph architecture: scene elements are node-based, each node having its own attributes and customization. As a result, the representation of a scene is based entirely on a network of interconnecting nodes, and these networks can be viewed as a dependency graph. Users who are students or teachers can download a full educational version from the Autodesk Education Community; the versions available there are licensed only for non-commercial use. The software comes with a full 36-month license, and once it expires, users can log in to the community to request a new 36-month license. Since its consolidation from two distinct packages, later versions of Maya contain all the features of the now-defunct Unlimited suites
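The node-based scene description can be illustrated with a toy dependency graph. Everything here is a hypothetical sketch of the general idea (attributes that can be set directly or driven through a connection from another node's attribute), not Maya's actual API, which is exposed through MEL and Python commands.

```python
class Node:
    """A toy dependency-graph node: named attributes whose values are
    either stored literally or pulled through an input connection from
    another node's attribute."""

    def __init__(self, name):
        self.name = name
        self.attrs = {}   # attribute name -> literal value
        self.inputs = {}  # attribute name -> (source node, source attr)

    def set(self, attr, value):
        self.attrs[attr] = value

    def connect(self, attr, src_node, src_attr):
        # Driving this attribute from another node's output.
        self.inputs[attr] = (src_node, src_attr)

    def get(self, attr):
        # A connected attribute evaluates by pulling through the network.
        if attr in self.inputs:
            src, src_attr = self.inputs[attr]
            return src.get(src_attr)
        return self.attrs[attr]

# A two-node network: a transform's translateX driven by a curve node.
curve = Node("anim_curve")
curve.set("output", 42.0)
xform = Node("transform")
xform.connect("translateX", curve, "output")
```

Changing `curve`'s output now changes what `xform.get("translateX")` returns, which is the essence of a scene represented as a network of interconnecting nodes.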

12. Polygon mesh –
A polygon mesh is a collection of vertices, edges and faces that defines the shape of a polyhedral object in 3D computer graphics and solid modeling. The study of polygon meshes is a large sub-field of computer graphics; different representations of polygon meshes are used for different applications and goals. The variety of operations performed on meshes may include Boolean logic, smoothing and simplification. Volumetric meshes are distinct from polygon meshes in that they explicitly represent both the surface and volume of a structure, while polygon meshes only explicitly represent the surface. As polygonal meshes are widely used in computer graphics, algorithms also exist for ray tracing and collision detection on them. Objects created with polygon meshes must store different types of elements; these include vertices, edges, faces, polygons and surfaces. In many applications, only vertices, edges and either faces or polygons are stored. A renderer may support only 3-sided faces, so polygons must be constructed of many of these. However, many renderers either support quads and higher-sided polygons, or are able to convert polygons to triangles on the fly; also, in certain applications like head modeling, it is desirable to be able to create both 3- and 4-sided polygons. A vertex is a position along with information such as color and a normal vector. An edge is a connection between two vertices. A face is a closed set of edges, in which a triangle face has three edges and a quad face has four edges. A polygon is a set of faces. In systems that support multi-sided faces, polygons and faces are equivalent; however, most rendering hardware supports only 3- or 4-sided faces, so polygons are represented as multiple faces. Mathematically a polygonal mesh may be considered an unstructured grid, or undirected graph, with additional properties of geometry and shape. 
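The face-vertex storage scheme described above can be sketched as a minimal class. The `Mesh` name and its methods are illustrative assumptions, not any particular library's API; the fan triangulation shows how a polygon is converted into the multiple 3-sided faces that most rendering hardware requires.

```python
from dataclasses import dataclass, field

@dataclass
class Mesh:
    """Minimal face-vertex mesh: vertex positions plus faces stored as
    lists of vertex indices (the common in-application representation)."""
    vertices: list = field(default_factory=list)  # [(x, y, z), ...]
    faces: list = field(default_factory=list)     # [[i0, i1, i2, ...], ...]

    def add_vertex(self, x, y, z):
        self.vertices.append((x, y, z))
        return len(self.vertices) - 1  # index used by faces

    def add_face(self, *indices):
        self.faces.append(list(indices))

    def triangulated(self):
        """Fan-triangulate every face, as needed for a renderer that
        supports only 3-sided faces."""
        tris = []
        for f in self.faces:
            for k in range(1, len(f) - 1):
                tris.append([f[0], f[k], f[k + 1]])
        return tris

# A single quad face over four vertices becomes two triangles.
m = Mesh()
for p in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]:
    m.add_vertex(*p)
m.add_face(0, 1, 2, 3)
```

Fan triangulation is the simplest on-the-fly conversion and is correct for convex polygons; concave polygons need a more careful algorithm such as ear clipping.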
Surfaces, more often called smoothing groups, are useful, but not required, to group smooth regions. Consider a cylinder with caps, such as a soda can: for smooth shading of the sides, all surface normals must point horizontally away from the center, while the normals of the caps must point straight up. Rendered as a single, Phong-shaded surface, the crease vertices would have incorrect normals, so some way of determining where to cease smoothing is needed to group smooth parts of a mesh. As an alternative to providing surfaces/smoothing groups, a mesh may contain other data for calculating the same information, such as a splitting angle. Additionally, very high resolution meshes are less subject to issues that would require smoothing groups. A further alternative is simply detaching the surfaces themselves from the rest of the mesh: renderers do not attempt to smooth edges across noncontiguous polygons. Materials: generally materials will be defined, allowing different portions of the mesh to use different shaders when rendered. It is also possible for meshes to contain other vertex attribute information such as colour, tangent vectors and weight maps to control animation. Polygon meshes may be represented in a variety of ways, using different methods to store the vertex, edge and face data
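The splitting-angle alternative to smoothing groups can be sketched as follows. The `smoothed_normal` helper and its 30-degree default are hypothetical choices for illustration: faces whose normals lie within the splitting angle of a reference face are averaged together, while faces beyond it (like a soda can's cap against its side) are left out, preserving the crease.

```python
import math

def smoothed_normal(face_normals, ref_index, split_angle_deg=30.0):
    """Average only the face normals within the splitting angle of the
    reference face's normal, so a crease sharper than the angle is kept.

    face_normals: unit-length (nx, ny, nz) tuples of faces sharing a vertex.
    ref_index: which face the resulting normal belongs to.
    """
    ref = face_normals[ref_index]
    cos_limit = math.cos(math.radians(split_angle_deg))
    # Dot product >= cos(limit) means the angle between normals <= limit.
    kept = [n for n in face_normals
            if sum(a * b for a, b in zip(ref, n)) >= cos_limit]
    sx, sy, sz = (sum(c) for c in zip(*kept))
    length = math.sqrt(sx * sx + sy * sy + sz * sz)
    return (sx / length, sy / length, sz / length)

# Soda-can crease: a side face (horizontal normal) meets a cap face
# (vertical normal) at 90 degrees, beyond the 30-degree threshold,
# so each keeps its own normal instead of smoothing across the edge.
side = (1.0, 0.0, 0.0)
cap = (0.0, 0.0, 1.0)
```

A .3ds-style smoothing group encodes the same decision explicitly as a per-face bitmask instead of deriving it from an angle, which is why the receiving program can only recreate an approximation of the original vertex normals.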

13. Amir Taaki –
Amir Taaki is a British-Iranian computer programmer who is best known for his leading role in the bitcoin project and for pioneering many open source projects. Forbes listed Taaki in their top 30 entrepreneurs of 2014. Amir Taaki was born 6 February 1988 in London, the eldest of three children of a Scottish-English mother and an Iranian father who is a property developer. From an early age Taaki took an interest in computer technology; after briefly attending three British universities, he gravitated to the free software movement. Taaki assisted in the creation of SDL Collide, an extension of Simple DirectMedia Layer. In 2006, Taaki became heavily involved in Crystal Space development under the pseudonym of genjix. He also developed a number of video games making use of free software, including the adventure game Crystal Core, was a participant in the Blender project Yo Frankie, and was a speaker at the 2007 Games Convention in Leipzig. In 2009 and 2010, Taaki made his living as a poker player; his experience with online gambling attracted him to the bitcoin project. In April 2011, Taaki and Donald Norman established the Bitcoin Consultancy, a group focused on bitcoin project development. Taaki created the first full reimplementation of the bitcoin protocol, named libbitcoin, worked on the bitcoin client Electrum and created other command line utilities around bitcoin; the bitcoin standardisation procedure was started by Taaki. In 2014, together with Cody Wilson, he launched the Dark Wallet project after a run on IndieGoGo which raised over $50,000. Taaki has been outspoken in favour of Internet activism such as Anonymous; a long-time contributor to free software, he advocates total data freedom. Taaki has labelled censorship policies a wedge towards ever-increasing censorship, and he proposes a shift away from specialist thinking towards a creative society of generalist knowledge workers. 
Taaki is a speaker of Esperanto, which he promotes as a country-neutral international auxiliary language alongside local languages. He writes that Esperanto serves to break barriers and help the flow of media across cultural boundaries. Amir Taaki formerly lived in an anarchist squat in Barcelona, Spain, and now resides in an anarchist squat in the former anti-G8 HQ building in London, England. In 2015, Taaki went to Rojava to offer his tech skills; he had no training, but spent three and a half months in the YPG military. He was then discharged and helped design the curriculum in Rojava.