Forums before death by AOL, social media and spammers... "We can't have nice things"
|    alt.conspiracy.princess-diana    |    What really happened to Lady Di...    |    10,071 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 8,542 of 10,071    |
|    frankenstein to All    |
|    Re: Internment of families without crimi    |
|    11 Jan 06 12:49:12    |
XPost: uk.politics.misc, uk.gov.local, uk.gov.social-security
XPost: alt.politics.british, uk.community.social-housing, uk.community.voluntary
From: @foster.com

A Distributed Objects Overview

A study of the DCOM system that Microsoft is developing (Waite Group, p. 323) reveals the problems Microsoft is trying to solve. What counts as a "distributed system" depends on whom you ask. The Waite Group's book (p. 323) defines a distributed system as "a collection of software entities physically spread out across two or more computers, working together to achieve a goal in common". That definition includes a file-sharing arrangement, though some programmers would not call that a distributed system. The Waite Group's position is that it does not matter whether the computers are aware that files exist on other machines rather than locally.

There are, however, distributed systems built with explicit knowledge of a computer network or communications link. These systems are designed from the beginning so that the system's constituent software components execute on separate machines (an object-oriented approach).

DCOM is the outgrowth of the evolving technology called distributed objects.

Initial client/server implementations placed some core application functions, such as the user interface and simple data-validation routines, locally on the client's machine. An interesting consequence of client/server computing is that the mainframe can be turned into a sort of super-server, providing database and business-logic services for all of the PC-based clients.

The mainframe is not past its useful life, though. A mainframe is the embodiment of centralized computing (Waite Group, p. 324) and an archetypal example of a nondistributed system. The need for mainframes has actually increased in spite of the option of distributed systems: credit-card companies, insurance firms, and telephone companies (as of the late 1990s) have a volume of information-processing work so massive that only mainframes can handle it.

Two- and Three-tier Network Architecture

Client/server computing has kept evolving. In the two-tier model the client holds all the user-interface functionality and the business logic required by the system (the "fat client"), while the server, the second tier, holds the application data. Application data is often stored in some type of database and stewarded by a database server program.

The client applications, which may be many in number, each running at its own workstation, send requests to the database whenever they need data stored there (an inventory database, for instance). A two-tier client/server system is still a good fit for a smaller, departmental type of application.

The two-tier architecture does suffer from several deficiencies, however. For one, each client carries much of the business-rules portion of the application, so the multi-user system depends on every individual client, and the clients can be too many to service whenever the application needs tweaking.

In addition to that upgrade/maintenance issue, the database is connected to all the clients, so changing its structure in any significant way also affects every client that searches the server database through a direct connection to its DBMS.
It is also more difficult to run more than one application per client when the data for several applications must all conform to the same DBMS format. And a change in any one application on the clients can again affect the multi-user system to the degree that the clients' other tasks are halted while the altered application is repaired and upgraded.

One can imagine solutions to the problems of a two-tier network architecture. The dependence of several applications on a single machine can be addressed by giving each application its own application server: the corporate research department has a server, the financial department has a server, and personnel has a server for its application as well. But unless we are considering an enterprise with 5,000 employees, too many servers can themselves become a problem.

Three-tier Network Architecture

The first consideration is that you do not want a system in which all the clients are directly connected to the main server. So it is necessary to have at least two servers, which makes this a three-tier network architecture. Object-oriented programming can also make an application server more reliable when it is designated to run several different applications.

In a three-tier architecture the clients still take responsibility for the user interface, and perhaps a limited amount of validation. Because these clients contain a minimum of functionality, such programs are sometimes referred to as thin clients. (Using the web itself to manage the individual clients, as Microsoft Windows operating-system updates do, is not related to the core functions of the application.)
There is some basis to the idea that a good object-oriented relational database can be shared between applications, so that part of the server problem can be partly controlled. The remainder probably comes down to sound application programming and the use of classes that work well together. Application suites such as Microsoft Office and Lotus SmartSuite can be treated as classes that are managed with script, and much of an application becomes more manageable when such integrated applications are used as a foundation. But what I really wanted to mention is that the early techniques of multiple-file compilation and #include files in the C language were a way to change parts of a program without interfering with the interface or with the uninvolved objects.

Systems and Models

A system is an assemblage of interrelated elements. One example is a mechanism, which may mean the related parts of a machine (as in the processor, memory, and other components of a computer) or the related parts or stages of any process (the mechanism of creation in art and science, the

[continued in next message]

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
(c) 1994, bbs@darkrealms.ca