13.4 Batch Processing

Batch processing is a special way to run non-urgent but long-running, CPU-intensive programs. Such programs can run at off-peak times, when the system is not busy executing other, higher-priority programs. Off-peak time is usually during the night, when it is not convenient for users to start their programs interactively. The batch utility provides a way to execute submitted programs at lower priority at an off-peak time, giving the system a chance to balance its CPU load.

A job scheduled by the batch utility is known as a batch-job; it is equivalent to an at-job submitted into the b queue for an immediate run at lower priority. Therefore, the at utility can also be used to schedule a batch-job:

    at -q b -m now

However, it is more convenient to use the dedicated batch utility for batch processing, and this is the usual method on the UNIX platform.

From a system standpoint, batch processing is a very useful and economical method of program execution, and system administrators should encourage users to take advantage of it. Even though the batch utility is easy to use, users often do not know about this possibility; instead they sometimes create an additional burden on the CPU at the most critical times, provoking an unnecessary and substantial degradation in system performance. Batch processing can also be an economical way to perform a number of administrative tasks.
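The equivalence between the two utilities can be tried directly from the command line. In the sketch below, the sort job and the file /var/tmp/bigfile are hypothetical placeholders, and the at/batch subsystem (the atd daemon) must be running for the submissions to succeed:

```shell
# Submit the same one-command job in two equivalent ways: both queue it
# into the b queue for immediate, low-priority execution (-m mails the
# job's output to the submitter). The "|| true" lets the sketch continue
# on systems where batch/atd is not available.
echo 'sort /var/tmp/bigfile -o /var/tmp/bigfile.sorted' | batch || true
echo 'sort /var/tmp/bigfile -o /var/tmp/bigfile.sorted' | at -q b -m now || true
```

In both cases the job text arrives on the utility's standard input; the difference is only in how the queue and run time are specified.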

13.4.1 The UNIX batch Utility

The generic form of the batch utility is:

    batch
    command_1
    command_2
    .....
    [Ctrl-D]

The batch utility reads the standard input until the terminating EOF character ([Ctrl-D] from the keyboard). It then submits the entered command sequence into the batch queue for immediate execution at low priority. The entered commands can be any UNIX commands, scripts, executable programs, or a combination thereof.

The Here Document is also the most convenient way to use the batch utility in shell script programming, as in the following example:

    # To schedule the batch-job
    batch << EOF
    The sequence of commands
    .....
    EOF
    # The batch-job was scheduled

Section II: Network Administration

Chapter List

    Chapter 14: Network Fundamentals
    Chapter 15: TCP/IP Network
    Chapter 16: Domain Name System
    Chapter 17: Network Information Service (NIS)
    Chapter 18: Network File System (NFS)
    Chapter 19: UNIX Remote Commands
    Chapter 20: Electronic Mail
    Chapter 21: UNIX Network Support
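As a complete, runnable sketch of this pattern, the script below schedules a hypothetical cleanup command as a batch-job. Since the batch utility may not be installed everywhere, the sketch falls back to cat, which simply prints the job text, so the Here Document mechanics can still be traced on any system:

```shell
#!/bin/sh
# Schedule a low-priority job from within a script using a Here Document.
# The find command inside the job is a hypothetical example.

# Fall back to cat when batch is unavailable (the job text is then just
# printed instead of being queued).
if command -v batch >/dev/null 2>&1; then
    SCHEDULER=batch
else
    SCHEDULER=cat
fi

echo "To schedule the batch-job"
# "|| true" keeps the sketch going even if the submission fails
# (e.g. when atd is not running).
$SCHEDULER << 'EOF' || true
find /var/tmp -type f -mtime +7 -print
EOF
echo "The batch-job was scheduled"
```

The quoted delimiter ('EOF') prevents the shell from expanding anything inside the job text before it is handed to the scheduler; with an unquoted delimiter, variables in the Here Document would be expanded at submission time instead.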

Chapter 14: Network Fundamentals

14.1 UNIX and Networking

One of the greatest advantages of the UNIX system is its inherent network-related structure. From its very beginnings, UNIX included a number of network-based characteristics that made it quite different from other operating systems of the time. When network technologies were still in their very early stages, UNIX already provided certain network services and powerful tools for handling network issues between remote hosts. From a network standpoint, the UNIX concept was so sound that it allowed an easy integration of UNIX into network technologies. It is even more appropriate to say that UNIX and networking merged, making UNIX the core operating system of the newly emerging network environment.

Today, even after so many years of intensive commercial use, UNIX is still far from being considered an obsolete operating system. UNIX was the first commercially successful and available network-oriented OS, and its use in networked environments was perhaps the biggest factor in ending the supremacy of mainframe computers and gigantic OSs. Despite its advancing age, UNIX remains a leading OS, offering more than any other OS alone and permanently keeping pace with newcomers. The primary advantages of UNIX are its openness and flexibility, which make it suitable for almost any kind of upgrade. Most of these upgrades were made in the network arena, which makes sense given the incredible advances in the field of networking. However, this flexibility, and UNIX's ability to integrate so many changes, only proves the sound conceptual approach of UNIX's designers. Regardless of where the credit should go, UNIX's main contribution to the overall development of computer technologies was, and still is, in networking; it is fair to say that the network-oriented UNIX concept practically enabled the tremendous growth of networking technologies.
Networks have grown so prolifically because they provide an important service: sharing information among users. Computers generate and process information that is often useless unless it is shared among a group of people; the network is the vehicle that enables data to be shared easily. Once a computer has been networked, its users are unlikely to want to return to an isolated system. This trend does not stop at the local level; forming a local network and cooperating with neighboring computers leads to global, worldwide networking. Today, this global network is known under the generic name Internet, named after what was once the world's largest experimental network.

Computer networking has brought new challenges and duties to system administrators. It is not enough to simply maintain the systems; the network itself requires a great deal of ongoing work. This issue is very important because it affects not only a single system, but also the other systems on the network. A familiarity with basic theoretical issues makes this job easier, and that is the purpose of this chapter and those that follow.

14.2 Computer Networks

A computer network is a communication system that connects end computers, usually referred to as hosts. The hosts can range in size from small microcomputers to the largest supercomputers. From a network point of view, a host is any machine participating in network communication, independent of its basic function and configuration (single-user or multi-user, general-purpose systems, dedicated servers, terminals, any kind of client, etc.).