When data is sent over the Internet, each unit transmitted includes both header information and the actual data being sent. The header identifies the source and destination of the packet, while the actual data is referred to as the payload. Because header information, or overhead data, is only used in the transmission process, it is stripped from the packet when it reaches its destination. Therefore, the payload is the only data received by the destination system.
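The header-plus-payload structure can be sketched in a few lines of Python. The 12-byte header layout below (source address, destination address, payload length) is a made-up example for illustration, not a real protocol's format:

```python
import struct

# Hypothetical 12-byte header: a 4-byte source address, a 4-byte
# destination address, and a 4-byte payload length (big-endian).
HEADER_FORMAT = ">III"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 12 bytes of overhead

def make_packet(source: int, dest: int, payload: bytes) -> bytes:
    """Prepend the header (overhead data) to the payload for transmission."""
    header = struct.pack(HEADER_FORMAT, source, dest, len(payload))
    return header + payload

def strip_header(packet: bytes) -> bytes:
    """At the destination, discard the header and keep only the payload."""
    _source, _dest, length = struct.unpack(HEADER_FORMAT, packet[:HEADER_SIZE])
    return packet[HEADER_SIZE:HEADER_SIZE + length]

packet = make_packet(0x0A000001, 0x0A000002, b"Hello, network!")
print(strip_header(packet))  # b'Hello, network!'
```

The payload comes out of `strip_header` exactly as it went in; only the 12 bytes of overhead are discarded.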
OS X is Apple's operating system that runs on Macintosh computers. It was first released in 2001 and over the next few years replaced Mac OS 9 (also known as Mac OS Classic) as the standard OS for Macs. It was called Mac OS X until version 10.8, when Apple dropped "Mac" from the name.
OS X was originally built from NeXTSTEP, an operating system designed by NeXT, which Apple acquired when Steve Jobs returned to Apple in 1997. Like NeXTSTEP, OS X is based on Unix and uses the same Mach kernel. This kernel provides OS X with better multithreading capabilities and improved memory management compared to Mac OS Classic. While the change forced Mac developers to rewrite their software programs, it provided necessary performance improvements and scalability for future generations of Macs.
The OS X desktop interface is called the Finder and includes several standard features. OS X does not have a task bar like Windows, but instead includes a menu bar, which is fixed at the top of the screen. The menu bar's options change depending on which application is currently running, and the bar is hidden only when full screen mode is enabled. The Finder also includes a Dock, which is displayed by default on the bottom of the screen. The Dock provides easy one-click access to frequently used applications and files. The Finder also displays a user-selectable desktop background that serves as a backdrop for icons and open windows.
When you start up a Mac, OS X loads automatically. It serves as the fundamental user interface, but also works behind the scenes, managing processes and applications. For example, when you double-click an application icon, OS X launches the corresponding program and provides memory to the application while it is running. It reallocates memory as necessary and frees up used memory when an application is quit. OS X also includes an extensive API, or library of functions, that developers can use when writing Mac programs.
While the OS X interface remains similar to the original version released in 2001, it has gone through several updates, each of which has added numerous new features to the operating system. Each major version of OS X also has a code name, such as Snow Leopard (10.6) and Mountain Lion (10.8).
Stands for Optical Character Recognition. This technology is what allows you to scan that paper you lost on your hard drive, but fortunately printed out, back into your computer. When a page of text is scanned into a computer without OCR software, all the computer sees is a bunch of graphical bits, or an image. In other words, it has no idea that there is text on the page, much less what the text says. However, an OCR program can convert the characters on the page into a text document that can be read by a word processing program. More advanced OCR programs can even preserve the document's formatting during the conversion.
Newline is a character that marks the end of a line of text. As the name implies, it is used to create a new line in a text document, database field, or any other block of text.
When typing in a word processor, you can enter a newline character by pressing the Enter or Return key on your keyboard. This creates a line break (also known as a carriage return or line feed) in the text and moves the cursor to the beginning of the next line. When a line break occurs at the end of a block of text, it is called a trailing newline.
The newline character is important in computer programming, since it allows programmers to search for line breaks in text files. For example, if a data file lists one element per line, the items can be delimited by newline characters. In most modern programming languages, the newline character is represented by the escape sequence \n (line feed), while \r represents a carriage return. Unix-based systems mark line breaks with \n alone, while Windows uses the combination \r\n. By searching for newline characters in text strings, programmers can parse documents line by line and remove unwanted line breaks.
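A short Python sketch shows both ideas: splitting a newline-delimited block into items, and removing an unwanted trailing line break from a single line. The sample data is made up for illustration:

```python
# A data block that lists one element per line, delimited by newlines,
# with a trailing newline after the last item.
data = "apple\nbanana\ncherry\n"

# splitlines() handles \n, \r, and \r\n line breaks uniformly,
# so the same parsing code works for Unix and Windows files.
items = data.splitlines()
print(items)  # ['apple', 'banana', 'cherry']

# rstrip("\r\n") removes a trailing line break from a single line.
line = "last line\r\n"
print(repr(line.rstrip("\r\n")))  # 'last line'
```

Note that the trailing newline does not produce an empty fourth item; `splitlines()` treats it as the terminator of the last line.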
Like file compression, the goal of media compression is to reduce file size and save disk space. However, media compression algorithms are specific to certain types of media, such as image, audio, and video files.
Most popular image formats use some type of compression. Three of the most common include JPEG, GIF, and PNG. JPEG compression, which is commonly used for digital photos, incorporates a lossy compression algorithm that averages nearby colors and removes color variations that are imperceptible to the human eye. GIF compression reduces the color palette of an image to 256 colors or fewer, which provides an efficient way to represent each color within the image. PNG compression uses a lossless compression algorithm that filters the image data and predicts pixel colors based on other nearby pixels. While each of these algorithms works in a different way, they can all be used to significantly reduce the file size of an uncompressed image.
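The prediction idea behind PNG's lossless filtering can be sketched with its simplest filter, which stores each byte as the difference from the byte to its left. This is a simplified illustration of the concept, not a full PNG implementation:

```python
def sub_filter(row: list[int]) -> list[int]:
    """PNG-style 'Sub' filter: store each byte as its difference from
    the byte to its left (mod 256). Smoothly varying pixels become runs
    of small numbers, which a lossless compressor packs efficiently."""
    return [(cur - (row[i - 1] if i > 0 else 0)) % 256
            for i, cur in enumerate(row)]

def unfilter(filtered: list[int]) -> list[int]:
    """Reverse the filter exactly -- no information is lost."""
    out = []
    for diff in filtered:
        prev = out[-1] if out else 0
        out.append((prev + diff) % 256)
    return out

row = [100, 101, 103, 103, 104]          # smoothly varying pixel values
print(sub_filter(row))                    # [100, 1, 2, 0, 1]
assert unfilter(sub_filter(row)) == row   # lossless round trip
```

Because the filter is perfectly reversible, the decoded image is bit-for-bit identical to the original, which is what makes PNG compression lossless.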
Several common audio file formats also use compression to save disk space. Popular audio formats, such as MP3 and M4A use compression algorithms that remove inaudible frequencies and reduce the dynamics of the sound. Since uncompressed audio files, such as AIFF and WAVE files, take up a lot of disk space, they are often compressed as MP3 or M4A files before they are distributed over the Internet. These files are typically around one tenth the size of the original audio files and have nearly identical sound quality.
Most video files are also compressed. Popular video formats, such as MPEG and DivX, compress video using a specific codec. Each codec uses a custom algorithm that removes redundant information from the video. For example, if the background of a video doesn't change for a while, a codec can reduce the file size by not redrawing the background every frame. A video codec may also incorporate audio compression to reduce the size of the audio track. Since encoded videos are decoded as they are played, the codec that was used to encode a video must also be available to decode the video. Therefore, in order to play a compressed video file on your computer, your video player software must have the appropriate codec installed.
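The "don't redraw the unchanged background" idea can be sketched as frame deltas: store only the pixels that differ from the previous frame. This is a heavily simplified stand-in for real inter-frame video compression, using rows of pixel values as pretend frames:

```python
def encode_frame(prev: list[int], cur: list[int]) -> dict:
    """Store only the pixels that changed since the previous frame."""
    return {i: v for i, (p, v) in enumerate(zip(prev, cur)) if p != v}

def decode_frame(prev: list[int], delta: dict) -> list[int]:
    """Rebuild the full frame by applying the changes to the previous one."""
    frame = list(prev)
    for i, v in delta.items():
        frame[i] = v
    return frame

background = [0, 0, 0, 0, 0, 0, 0, 0]
frame2     = [0, 0, 9, 9, 0, 0, 0, 0]   # only two pixels changed

delta = encode_frame(background, frame2)
print(delta)                             # {2: 9, 3: 9}
assert decode_frame(background, delta) == frame2
```

Instead of storing all eight pixels again, the encoder stores two changed values, and the decoder reconstructs the frame from the previous one, which is why the decoder needs the same codec the encoder used.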
Related file extensions: .JPG, .GIF, .PNG, .MP3, .M4A, .MPG, .DIVX.
Stands for HyperText Transfer Protocol Secure. HTTPS is the same thing as HTTP, but uses a Secure Sockets Layer (SSL) for security purposes. Some examples of sites that use HTTPS include banking and investment websites, e-commerce websites, and most websites that require you to log in.
Websites that use the standard HTTP protocol transmit and receive data in an unsecured manner. This means it is possible for someone to eavesdrop on the data being transferred between the user and the Web server. While this is highly unlikely, it is not a comforting thought that someone might be capturing your credit card number or other personal information that you enter on a website. Therefore, secure websites use the HTTPS protocol to encrypt the data being sent back and forth with SSL encryption. If someone were to capture the data being transferred via HTTPS, it would be unrecognizable.
You can tell if a website is secure by viewing the URL in the address field of your Web browser. If the Web address starts with https://, you know you are accessing a secure website. Most browsers will also display a lock icon somewhere along the edge of the window to indicate the website you are currently visiting is secure. You can click the lock icon to view the secure certificate that authenticates the website.
So whenever you are asked to enter personal or financial information on a website, make sure that the URL starts with https:// and that the lock icon appears in the window. Then you can be sure that the website is secure and any data you enter will only be recognized by your computer and the Web server.
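The URL check described above is easy to automate. The sketch below only inspects the address scheme, the way you would by glancing at the address bar; it does not validate the site's certificate, which the browser does for you when it shows the lock icon. The addresses are placeholders:

```python
from urllib.parse import urlparse

def is_https(url: str) -> bool:
    """Return True when the address uses the secure HTTPS scheme."""
    return urlparse(url).scheme == "https"

print(is_https("https://www.example.com/login"))  # True
print(is_https("http://www.example.com/login"))   # False
```

Data sent to the first address would be SSL-encrypted in transit; data sent to the second would travel in the clear.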
A halftone, or halftone image, is an image composed of discrete dots rather than continuous tones. When viewed from a distance, the dots blur together, creating the illusion of continuous lines and shapes. By halftoning an image (converting it from a bitmap to a halftone), it can be printed using less ink. Therefore, many newspapers and magazines use halftoning to print pages more efficiently.
Originally, halftoning was performed mechanically by printers that printed images through a screen with a grid of holes. During the printing process, ink passed through the holes in the screen, creating dots on the paper. For monochrome images, only one pass was needed to create an image. For multicolor images, several passes or screens were required.
Today's printers are more advanced and typically do not contain physical screens. Instead, halftone images are generated by a computer and printed directly onto the paper. By using a process called dithering, modern printers can randomize the dot patterns, creating a more natural appearance. This produces realistic images using far less ink than fully saturated ones.
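A small sketch shows how a computer can turn grayscale values into halftone dots. It uses ordered dithering with a 2x2 Bayer threshold matrix, one simple member of the dithering family described above (real printer drivers use larger matrices or error diffusion):

```python
# 2x2 Bayer threshold matrix, scaled to the 0-255 grayscale range.
BAYER_2X2 = [[0, 128],
             [192, 64]]

def halftone(gray: list[list[int]]) -> list[list[int]]:
    """Ordered dithering: print a dot (1) wherever the pixel is darker
    than the repeating threshold pattern; otherwise leave paper (0)."""
    return [[1 if gray[y][x] < BAYER_2X2[y % 2][x % 2] else 0
             for x in range(len(gray[y]))]
            for y in range(len(gray))]

image = [[ 30,  30,  30,  30],    # dark rows -> mostly dots
         [ 30,  30,  30,  30],
         [200, 200, 200, 200],    # light rows -> few or no dots
         [200, 200, 200, 200]]
for row in halftone(image):
    print(row)
```

Dark regions trip most of the thresholds and fill with dots, while light regions trip few of them, so from a distance the dot density reads as continuous tone.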
Like a standard bitmap, the quality of a halftone image depends largely on its resolution. A halftone with a high resolution (measured in lines per inch, or LPI) will have greater detail than a halftone with a low resolution. While the goal of halftoning is typically to create a realistic image, sometimes low resolutions are used for an artistic effect.
Stands for General-Purpose computation on Graphics Processing Units. GPGPU, or GPU computing, is the use of a GPU to handle general computing operations. Modern operating systems allow programs to access the GPU alongside the CPU, speeding up the overall performance.
While GPUs are designed to process graphics calculations, they can also be used to perform other operations. GPGPU maximizes processing efficiency by offloading some operations from the central processing unit (CPU) to the GPU. Instead of sitting idle when not processing graphics, the GPU is constantly available to perform other tasks. Since GPUs are optimized for processing vector calculations, they can even process some instructions faster than the CPU.
GPGPU is a type of parallel processing, in which operations are processed in tandem between the CPU and GPU. When the GPU finishes a calculation, it may store the result in a buffer, then pass it to the CPU. Since processors can complete millions of operations each second, data is often stored in the buffer only for a few milliseconds.
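The dispatch-and-collect pattern can be sketched with Python's standard library, using a thread pool as a stand-in for the GPU (a real GPGPU program would use OpenCL or CUDA instead). The element-wise vector addition is exactly the kind of data-parallel work GPUs are optimized for:

```python
from concurrent.futures import ThreadPoolExecutor

def vector_add(a: list[float], b: list[float]) -> list[float]:
    """A data-parallel operation: every element of the result can be
    computed independently, which is what GPUs excel at."""
    return [x + y for x, y in zip(a, b)]

# The executor stands in for the GPU: the main thread (the "CPU")
# submits the work, is free to continue, and later reads the result
# back, much like reading a GPU buffer.
with ThreadPoolExecutor() as gpu:
    future = gpu.submit(vector_add, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
    # ... the CPU could do other work here while the "GPU" computes ...
    result = future.result()   # collect the buffered result

print(result)  # [11.0, 22.0, 33.0]
```

The key idea is the division of labor: the CPU coordinates and keeps working while the offloaded vector operation runs in parallel.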
GPU computing is made possible using a programming language that allows the CPU and GPU to share processing requests. The most popular is OpenCL, an open standard supported by multiple platforms and video cards. Others include CUDA (Compute Unified Device Architecture), an API created by NVIDIA, and APP (Accelerated Parallel Processing), an SDK provided by AMD.
A flowchart is a diagram that describes a process or operation. It includes multiple steps, which the process flows through from start to finish. Common uses for flowcharts include developing business plans, defining troubleshooting steps, and designing mathematical algorithms. Some flowcharts may only include a few steps, while others can be highly complex, containing hundreds of possible outcomes.
Flowcharts typically use standard symbols to represent different stages or actions within the chart. For example, each step is shown within a rectangle, while each decision is displayed in a diamond. Arrows are placed between the different symbols to show the direction the process is flowing. While flowcharts can be created with a pen and paper, there are several software programs available that make designing flowcharts especially easy. Common programs that can be used to create flowcharts include SmartDraw and Visio for Windows and OmniGraffle for the Mac.
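The symbols above map naturally onto a small data structure: rectangles become process steps, diamonds become yes/no decisions, and the arrows become links between them. The troubleshooting chart below is made up for illustration:

```python
# A tiny troubleshooting flowchart. Rectangles are "process" steps,
# diamonds are "decision" steps, and the next/yes/no keys are the arrows.
FLOWCHART = {
    "start":      {"type": "process", "text": "Printer won't print",
                   "next": "plugged_in"},
    "plugged_in": {"type": "decision", "text": "Is it plugged in?",
                   "yes": "restart", "no": "plug_in"},
    "plug_in":    {"type": "process", "text": "Plug the printer in",
                   "next": "end"},
    "restart":    {"type": "process", "text": "Restart the printer",
                   "next": "end"},
    "end":        {"type": "process", "text": "Done", "next": None},
}

def run(chart: dict, answers: dict) -> list[str]:
    """Follow the arrows from 'start' to the end, taking the branch
    given in `answers` at each decision diamond."""
    path, step = [], "start"
    while step is not None:
        node = chart[step]
        path.append(node["text"])
        if node["type"] == "decision":
            step = node["yes"] if answers[step] else node["no"]
        else:
            step = node["next"]
    return path

print(run(FLOWCHART, {"plugged_in": False}))
```

Following the chart with a "no" answer at the decision diamond visits the steps Printer won't print, Is it plugged in?, Plug the printer in, and Done, exactly as tracing the arrows by hand would.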