
NAME - KRUSHITHA. V. P.
ROLL NO. - 520791371
ASSIGNMENT 2
SUBJECT - MC0073 SYSTEM PROGRAMMING

Master of Computer Application (MCA) Semester 3


MC0073 System Programming
Assignment Set 2

1. Explain the following with respect to Loaders:


A) Design of an Absolute Loader
B) A Simple Bootstrap Loader

Ans:
When a program needs to be executed, the OS first extracts all relevant file information (usually from the file header) and carries out any necessary actions before putting it into memory; in other words, the OS simply decodes the object file into an understandable form in memory. This behavior is often described as loading a program.

A loader, then, is a computer program that transfers data from offline memory into internal storage. Put another way, it is an operating system utility that copies programs from a storage device to main memory, where they can be executed. In addition to copying a program into main memory, the loader can also replace virtual addresses with physical addresses. Most loaders are transparent, i.e., we cannot directly execute them, but the operating system uses them when necessary.

A binary object file is either an executable file that runs on a particular machine or a file containing object code that needs to be linked. The object code or executable code is generated by a compiler or by an assembler.
A) Design of an Absolute Loader:
As the example loader does not need to perform functions such as linking and program relocation, its operation is very simple. All functions are accomplished in a single pass. The Header record is checked to verify that the correct program has been presented for loading (and that it will fit into the available memory). As each Text record is read, the object code it contains is moved to the indicated address in memory. When the End record is encountered, the loader jumps to the specified address to begin execution of the loaded program.

Each pair of bytes from the object program record must be packed together into one byte during loading. Each printed character represents one byte of the object program record; in memory, on the other hand, each printed character represents one hexadecimal digit, i.e., half a byte. Most machines store object programs in a binary form, with each byte of object code stored as a single byte in the object program.
Program:
Begin
    Read Header record
    Verify program name and length
    Read first Text record
    While record type <> 'E' do
    Begin
        {if the object code is in character form, convert it into internal representation}
        Move object code to the specified memory address
        Read next object program record
    End
    Jump to the address specified in the End record
End
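
As a minimal sketch of this algorithm (the record layout, field widths, and sample records below are illustrative assumptions loosely modeled on the SIC textbook format, not a fixed standard), a loader of this kind can be written as:

# Minimal sketch of an absolute loader for a simplified object format.
# Assumed (hypothetical) record layout:
#   H<name><start><length>   T<addr><object code as hex chars>   E<start addr>

def absolute_load(object_records, memory):
    """Load an absolute object program into 'memory' (a bytearray)."""
    header = object_records[0]
    assert header.startswith("H"), "first record must be a Header record"
    # Verify program name and length here if desired.
    for record in object_records[1:]:
        if record.startswith("T"):
            addr = int(record[1:5], 16)           # load address of this record
            code = bytes.fromhex(record[5:])      # pack two hex chars into one byte
            memory[addr:addr + len(code)] = code  # move object code into memory
        elif record.startswith("E"):
            return int(record[1:5], 16)           # address where execution begins
    raise ValueError("missing End record")

memory = bytearray(0x10000)
records = ["HCOPY 10000005", "T1000C1039C4B105D3F2FEC10", "E1000"]
print(hex(absolute_load(records, memory)), memory[0x1000:0x1005].hex())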
B) A Simple Bootstrap Loader
A bootstrap loader is a very small program (usually residing in ROM) which reads a fixed location on a disk (e.g., the MBR) and passes control over to it. The data residing at that fixed location is, in general, slightly bigger and more sophisticated, and it then takes responsibility for loading the actual operating system and passing control to it.
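
In outline (a hedged illustration only; real bootstrap code lives in ROM and is written in the machine code of the specific hardware, and the block size below is just the conventional MBR size), a bootstrap loader reduces to a copy-and-jump:

# Toy illustration of a bootstrap loader: copy a fixed-size block from a
# fixed "disk" location into memory, then transfer control to it.
BOOT_BLOCK_SIZE = 512                # e.g. the size of an MBR

def bootstrap(disk: bytes, memory: bytearray, load_addr: int) -> int:
    boot_block = disk[:BOOT_BLOCK_SIZE]                      # read the fixed location
    memory[load_addr:load_addr + len(boot_block)] = boot_block
    return load_addr                                         # "jump" to the loaded code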

2. Write about Deterministic and Non-Deterministic Finite Automata with suitable


numerical examples.
ANS:

A Finite State Automaton or Finite State Machine is an abstract machine consisting of a set of states (including an initial state), a set of input events, a set of output events, and a state transition function. The function takes the current state and an input event and returns the new set of output events and the next state. Some states may be designated as terminal states. The state machine can also be viewed as a function which maps an ordered sequence of input events into a corresponding sequence of (sets of) output events.
A further distinction is between deterministic and non-deterministic automata. In deterministic automata, for each state there is at most one transition for each possible input. In non-deterministic automata, there can be more than one transition from a given state for a given possible input. Non-deterministic automata are usually implemented by converting them to deterministic automata; in the worst case, the generated deterministic automaton is exponentially bigger than the non-deterministic automaton (although it can usually be substantially optimized).

The standard acceptance condition for non-deterministic automata requires that some computation accepts the input. Alternating automata also provide a dual notion, where for acceptance all non-deterministic computations must accept.
Deterministic Finite Automata (DFA):
Definition:
A deterministic finite automaton (DFA) is a 5-tuple (S, Σ, T, s, A) consisting of:
- an alphabet (Σ)
- a set of states (S)
- a transition function (T : S × Σ → S)
- a start state (s ∈ S)
- a set of accept states (A ⊆ S)
The machine starts in the start state and reads in a string of symbols from its
alphabet. It uses the transition function T to determine the next state using the current
state and the symbol just read. If, when it has finished reading, it is in an accepting
state, it is said to accept the string; otherwise it is said to reject the string. The set of strings it accepts forms a language, which is the language the DFA recognizes.
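
For a concrete numerical example (my own illustration, not taken from the course text), take the DFA over Σ = {0, 1} that accepts exactly the strings containing an even number of 1s: S = {q0, q1}, start state q0, accept states A = {q0}, with T(q0, 1) = q1, T(q1, 1) = q0, and T(q, 0) = q for both states. A small simulator in Python:

# DFA accepting binary strings with an even number of 1s.
T = {("q0", "0"): "q0", ("q0", "1"): "q1",
     ("q1", "0"): "q1", ("q1", "1"): "q0"}
start, accept = "q0", {"q0"}

def dfa_accepts(string):
    state = start
    for symbol in string:
        state = T[(state, symbol)]   # exactly one transition per (state, symbol)
    return state in accept

print(dfa_accepts("1001"))   # True  -- two 1s
print(dfa_accepts("10"))     # False -- one 1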

Non-Deterministic Finite Automata (NFA):

Definition:
A non-deterministic finite automaton (NFA) is a 5-tuple (S, Σ, T, s, A) consisting of:
- an alphabet (Σ)
- a set of states (S)
- a transition function (T : S × (Σ ∪ {ε}) → P(S))
- a start state (s ∈ S)
- a set of accept states (A ⊆ S)
where P(S) is the power set of S and ε is the empty string. The machine starts in the start state and reads in a string of symbols from its alphabet. It uses the transition relation T to determine the next state(s) using the current state and the symbol just read or the empty string. If, when it has finished reading, it is in an accepting state, it is said to accept the string; otherwise it is said to reject the string. The set of strings it accepts forms a language, which is the language the NFA recognizes.
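
As a matching numerical example (again my own illustration), consider the NFA over Σ = {0, 1} that accepts strings ending in "01": states {p, q, r}, start state p, accept states {r}, with T(p, 0) = {p, q}, T(p, 1) = {p} and T(q, 1) = {r}. The simulator below tracks the set of states the machine could currently be in, which is also the idea behind the subset construction used to convert an NFA into a DFA:

# NFA accepting binary strings that end in "01".
T = {("p", "0"): {"p", "q"}, ("p", "1"): {"p"},
     ("q", "1"): {"r"}}
start, accept = "p", {"r"}

def nfa_accepts(string):
    states = {start}                 # all states the NFA could be in
    for symbol in string:
        states = set().union(*(T.get((s, symbol), set()) for s in states))
    return bool(states & accept)

print(nfa_accepts("1101"))   # True  -- ends in 01
print(nfa_accepts("10"))     # False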

4. Write about different Phases of Compilation.


ANS:
A compiler is a program that reads a program written in one language, the source language, and translates it into an equivalent program in another language, the target language. The translation process should also report the presence of errors in the source program.

[Diagram: Source Program → Compiler → Target Program, with Error Messages reported along the way.]

There are two parts of compilation.


The analysis part breaks up the source program into constituent pieces and creates an intermediate representation of the source program.
The synthesis part constructs the desired target program from the intermediate
representation.

A typical compiler implements six phases. The major reason for separating a compiler into phases is simplicity. Compiler design is thus full of the "Divide and Conquer" strategy, component-level design, reusability and performance optimization.

Lexical Analysis
Syntax Analysis
Error Recovery
Scope Analysis
Type Analysis
Code Generation

[Diagram: the phases applied in sequence, producing the target program as output.]

The program that performs lexical analysis is called a lexical analyzer, scanner, or tokenizer. Its purpose is to break a sequence of characters into subsequences called tokens. The syntax analysis phase, called the parser, reads tokens and validates them in accordance with a grammar. The vocabulary, i.e., the set of predefined tokens, is composed of word symbols (reserved words), names (identifiers), numerals (constants), and special symbols (operators). During compilation, a compiler will find errors such as lexical, syntax, semantic, and logical errors. If a token does not belong to the vocabulary, it is a lexical error. A grammar dictates the syntax of a language; if a sentence does not follow the syntax, it is called a syntax error. A semantic error is something like assigning an integer to a double variable. A logical error simply means that the program logic is not correct, even though the program is syntactically and semantically correct.
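
As a small illustration of the lexical-analysis phase (a sketch of mine; the token classes and regular expressions are an arbitrary choice, not a fixed standard), the scanner below breaks the statement position = initial + rate * 60 into tokens:

import re

# Token classes and their patterns (illustrative only).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # numerals (constants)
    ("IDENT",  r"[A-Za-z_]\w*"),  # names (identifiers)
    ("OP",     r"[+\-*/=]"),      # special symbols (operators)
    ("SKIP",   r"\s+"),           # whitespace, discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(source):
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("position = initial + rate * 60")))
# [('IDENT', 'position'), ('OP', '='), ('IDENT', 'initial'), ('OP', '+'),
#  ('IDENT', 'rate'), ('OP', '*'), ('NUMBER', '60')]

The parser would then check this token stream against the grammar, and type analysis would flag, for example, an identifier used with the wrong type.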

Java Compiler and Environment:


Anything which converts the Java language to any other form is a Java compiler, as the term is commonly understood. Only something which converts the Java language to the JVM language is a Java compiler as defined by Sun. It is sad that the legal system forces such a distinction of technical terms to be important for political reasons.

Java "compilers" that convert Java source into something executed in the host machine's native environment (Windows, Linux, VMS, OS/9, VxWorks, etc.) are not, by that definition, "Java compilers"; they are Java converters. This distinction is not simply a matter of semantics; it goes to the very heart of the Java Machine and its usefulness.

There are also virtual CPUs, such as the Java Virtual Machine. These could have been implemented in hardware, but happen to be implemented in software. In the future we may see Java Virtual Machines implemented in hardware; then we will have to distinguish between those that are Java Virtual Machines and those that are true Java Machines.

6. Describe the following with respect to Software Tools for Program


Development:
A) Compilers

B) Editors

C) Debuggers

D) Interpreters

A) Compilers
A compiler is a computer program (or set of programs) that transforms source code
written in a programming language (the source language) into another computer
language (the target language, often having a binary form known as object code). The
most common reason for wanting to transform source code is to create an executable
program.
The name "compiler" is primarily used for programs that translate source code from a
high-level programming language to a lower level language (e.g., assembly language or
machine code). A program that translates from a low level language to a higher level one
is a decompiler. A program that translates between high-level languages is usually called
a language translator, source to source translator, or language converter. A language
rewriter is usually a program that translates the form of expressions without a change of
language.
A compiler is likely to perform many or all of the following operations: lexical analysis,
preprocessing, parsing, semantic analysis, code generation, and code optimization.

Program faults caused by incorrect compiler behavior can be very difficult to track down and work around, so compiler implementors invest a lot of time ensuring the correctness of their software.
The term compiler-compiler is sometimes used to refer to a parser generator, a tool
often used to help create the lexer and parser.
Structure of a compiler
A compiler bridges source programs written in high-level languages with the underlying hardware. A compiler is required 1) to recognize the legitimacy of programs, 2) to generate correct and efficient code, 3) to manage run-time organization, and 4) to format output according to assembler or linker conventions. A compiler consists of three main parts: the frontend, the middle-end, and the backend.

The frontend checks whether the program is correctly written in terms of the programming language syntax and semantics. Here legal and illegal programs are recognized, and errors are reported, if any, in a useful way. Type checking is also performed by collecting type information. The frontend generates an IR (intermediate representation) for the middle-end. Optimization of this part is largely a solved problem, so much of it is already automated; there are efficient algorithms, typically in O(n) or O(n log n).

The middle-end is where the optimizations for performance take place. Typical transformations are 1) removal of useless or unreachable code, 2) discovery and propagation of constant values, 3) relocation of a computation to a less frequently executed place (e.g., out of a loop), and 4) specialization of a computation based on its context. The middle-end generates IR for the following backend. Most optimization efforts are focused on this part.

The backend is responsible for translating the IR into the target assembly code. The target instruction(s) are chosen for each IR instruction, and variables are allocated to registers. The backend exploits the hardware by figuring out how to keep parallel functional units busy, how to fill delay slots, and so on. Although most of the underlying optimization problems are NP-hard, heuristic techniques are well developed.
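
To make transformation 2) above concrete, here is a toy sketch (not from the course material; the three-address IR format is invented for illustration) of constant propagation and folding over straight-line code:

# Toy constant folding/propagation over straight-line three-address code.
# Each instruction is (target, op, left, right); operands are names or ints.
def fold_constants(ir):
    consts, out = {}, []
    for target, op, left, right in ir:
        l = consts.get(left, left)                # propagate known constant values
        r = consts.get(right, right)
        if isinstance(l, int) and isinstance(r, int):
            value = {"+": l + r, "*": l * r}[op]  # fold the operation at compile time
            consts[target] = value
            out.append((target, "const", value, None))
        else:
            out.append((target, op, l, r))
    return out

ir = [("a", "+", 2, 3), ("b", "*", "a", 4), ("c", "+", "b", "x")]
print(fold_constants(ir))
# [('a', 'const', 5, None), ('b', 'const', 20, None), ('c', '+', 20, 'x')]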

A diagram of the operation of a typical multi-language, multi-target compiler.


Hardware compilation
The output of some compilers may target hardware at a very low level, for example a Field-Programmable Gate Array (FPGA) or a structured Application-Specific Integrated Circuit (ASIC). Such compilers are said to be hardware compilers or synthesis tools, because the programs they compile effectively control the final configuration of the hardware and how it operates; the output of the compilation is not instructions that are executed in sequence, but only an interconnection of transistors or lookup tables. For example, XST is the Xilinx Synthesis Tool used for configuring FPGAs. Similar tools are available from Altera, Synplicity, Synopsys and other vendors.
C) Debuggers
A debugger or debugging tool is a computer program that is used to test and debug
other programs (the "target" program). The code to be examined might alternatively be
running on an instruction set simulator (ISS), a technique that allows great power in its
ability to halt when specific conditions are encountered but which will typically be somewhat slower than executing the code directly on the appropriate (or the same)
processor. Some debuggers offer two modes of operation - full or partial simulation, to
limit this impact.
When the program "crashes" or reaches a preset condition, the debugger typically shows the position in the original code if it is a source-level debugger or symbolic debugger, commonly now seen in integrated development environments. If it is a low-level debugger or a machine-language debugger, it shows the line in the disassembly (unless it also has online access to the original source code and can display the appropriate section of code from the assembly or compilation). (A "crash" happens when the program cannot continue normally because of a programming bug. For example, the program might have tried to use an instruction not available on the current version of the CPU or attempted to access unavailable or protected memory.)
Typically, debuggers also offer more sophisticated functions such as running a program
step by step (single-stepping or program animation), stopping (breaking) (pausing the
program to examine the current state) at some event or specified instruction by means of
a breakpoint, and tracking the values of some variables. Some debuggers have the
ability to modify the state of the program while it is running, rather than merely to
observe it. It may also be possible to continue execution at a different location in the
program to bypass a crash or logical error.
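
These facilities can be tried out directly with Python's built-in source-level debugger, pdb (used here only as a familiar, freely available stand-in for debuggers in general):

import pdb

def average(values):
    total = 0
    for v in values:
        total += v
    return total / len(values)   # raises ZeroDivisionError when values == []

pdb.set_trace()                  # breakpoint: execution pauses here under pdb's control
print(average([]))               # single-step with 'n' (next) or 's' (step into),
                                 # inspect state with 'p values', resume with 'c'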
Hardware support for debugging
Most modern microprocessors have at least one of these features in their CPU design to
make debugging easier:

hardware support for single-stepping a program, such as the trap flag.

An instruction set that meets the Popek and Goldberg virtualization requirements
makes it easier to write debugger software that runs on the same CPU as the
software being debugged; such a CPU can execute the inner loops of the
program under test at full speed, and still remain under the control of the
debugger.

In-System Programming allows an external hardware debugger to re-program a


system under test (for example, adding or removing instruction breakpoints).
Many systems with such ISP support also have other hardware debug support.


Hardware support for code and data breakpoints, such as address comparators
and data value comparators or, with considerably more work involved, page fault
hardware

JTAG access to hardware debug interfaces such as those on ARM architecture


processors or using the Nexus command set. Processors used in embedded
systems typically have extensive JTAG debug support.

Microcontrollers with as few as six pins need to use low pin-count substitutes for
JTAG, such as BDM, Spy-Bi-Wire, or DebugWire on the Atmel AVR. DebugWire,
for example, uses bidirectional signaling on the RESET pin.

List of debuggers

[Image: Winpdb debugging itself.]


AppPuncher Debugger (used to debug Rich Internet Applications)
AQtime
CA/EZTEST (Cics Interactive test/debug)
CodeView
DBG a PHP Debugger and Profiler
dbx
DDD (Data Display Debugger)
Distributed Debugging Tool (Allinea DDT)
DDTLite Allinea DDTLite for Visual Studio 2008
DEBUG the built-in debugger of DOS and Microsoft Windows
Debugger for MySQL
Opera Dragonfly
Dynamic debugging technique (DDT), and its octal counterpart Octal Debugging
Technique
Eclipse
Embedded System Debug Plug-in for Eclipse
FusionDebug
gDEBugger OpenGL, OpenGL ES and OpenCL Debugger and Profiler. For
Windows, Linux, Mac OS X and iPhone
GNU Debugger (GDB)
Intel Debugger (IDB)
Insight


Parasoft Insure++
iSYSTEM In circuit debugger for Embedded Systems
Interactive Disassembler (IDA Pro)
Java Platform Debugger Architecture
Jinx, a whole-system debugger for heisenbugs. It works transparently as a
device driver.
JSwat open-source Java debugger
MacsBug
Nemiver graphical C/C++ Debugger for the GNOME desktop environment
OLIVER (CICS interactive test/debug) - a GUI equipped instruction set simulator
(ISS)
OllyDbg
Omniscient Debugger (Forward and backward debugger for Java)
pydbg
IBM Rational Purify
RealView Debugger - Commercial debugger produced for and designed by ARM
sdb
SIMMON (Simulation Monitor)
SIMON (Batch Interactive test/debug) - a GUI equipped instruction set simulator
(ISS) for batch
SoftICE
Software Diagnostics Developer Edition
TotalView
Turbo Debugger
Ups C, Fortran source level debugger
Valgrind
VB Watch Debugger debugger for Visual Basic 6.0
Microsoft Visual Studio Debugger
WinDbg
WinGDB - Debugging with GDB under Visual Studio. Remote Linux (via SSH),
MinGW, Cygwin, embedded systems.
Xdebug PHP debugger and profiler.
Zeta Debugger: Debugging with Visual Studio and Borland.

D) Interpreters
An interpreter is a program that executes instructions written in a high-level language. There are two
ways to run programs written in a high-level language. The most common is to compile
the program; the other method is to pass the program through an interpreter.
An interpreter translates high-level instructions into an intermediate form, which it then
executes. In contrast, a compiler translates high-level instructions directly into machine
language. Compiled programs generally run faster than interpreted programs. The
advantage of an interpreter, however, is that it does not need to go through the
compilation stage during which machine instructions are generated. This process can be time-consuming if the program is long. The interpreter, on the other hand, can
immediately execute high-level programs. For this reason, interpreters are sometimes
used during the development of a program, when a programmer wants to add small
sections at a time and test them quickly. In addition, interpreters are often used in
education because they allow students to program interactively.
Both interpreters and compilers are available for most high-level languages. However,
BASIC and LISP are especially designed to be executed by an interpreter. In addition,
page description languages, such as PostScript, use an interpreter. Every PostScript
printer, for example, has a built-in interpreter that executes PostScript instructions.
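
As a tiny sketch of the idea (my own illustration, far simpler than a PostScript or BASIC interpreter), the program below parses an arithmetic expression into a tree and executes it directly, without ever producing machine code:

import ast, operator

# A tiny expression interpreter: parse to a syntax tree, then evaluate it directly.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate(node):
    if isinstance(node, ast.Expression):
        return evaluate(node.body)
    if isinstance(node, ast.Constant):   # a numeric literal
        return node.value
    if isinstance(node, ast.BinOp):      # left <op> right
        return OPS[type(node.op)](evaluate(node.left), evaluate(node.right))
    raise SyntaxError("unsupported construct")

print(evaluate(ast.parse("2 + 3 * 4", mode="eval")))   # 14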
Advantages and disadvantages of using interpreters
Programmers usually write programs in high level code which the CPU cannot execute.
So this source code has to be converted into machine code. This conversion is done by
a compiler or an interpreter. A compiler makes the conversion just once, while an
interpreter typically converts it every time a program is executed (or in some languages
like early versions of BASIC, every time a single instruction is executed).
Development cycle
During program development the programmer makes frequent changes to source code.
A compiler needs to make a compilation of the altered source files, and link the whole
binary code before the program can be executed. An interpreter usually just needs to
translate to an intermediate representation or not translate at all, thus requiring less time
before the changes can be tested.
Distribution
An interpreted program can be distributed as source code. It needs to be translated on each target machine, which takes more time but makes the program distribution independent of the machine's architecture.


Execution environment
An interpreter will make source translations during runtime. This means every line has to
be converted each time the program runs. This process slows down the program
execution and is a major disadvantage of interpreters over compilers. Another major disadvantage of an interpreter is that it must be present on the machine as additional software in order to run the program.

3. Explain with suitable numerical examples the concepts of Moore Machine and Mealy Machine.
ANS:
Mealy Machine

In the theory of computation, a Mealy machine is a finite state transducer that generates
an output based on its current state and input. This means that the state diagram will
include both an input and output signal for each transition edge. In contrast, the output of
a Moore finite state machine depends only on the machine's current state; transitions
are not directly dependent upon input.


The name Mealy machine comes from that of the concept's promoter, George H. Mealy,
a state-machine pioneer who wrote "A Method for Synthesizing Sequential Circuits" in
1955.[1]
Mealy machines provide a rudimentary mathematical model for cipher machines.
Taking the Latin alphabet as both the input and output alphabet, for example, a Mealy machine can be designed that, given a string of letters (a sequence of inputs), processes it into a ciphered string (a sequence of outputs). However, although one could use a Mealy model to describe the Enigma, the state diagram would be too complex to provide a feasible means of designing complex ciphering machines.

The state diagram of a simple Mealy machine

A Mealy machine is a 6-tuple (S, S0, Σ, Λ, T, G), consisting of the following:

- a finite set of states (S)
- a start state (also called initial state) S0 which is an element of S
- a finite set called the input alphabet (Σ)
- a finite set called the output alphabet (Λ)
- a transition function (T : S × Σ → S) mapping a state and the input alphabet to the next state
- an output function (G : S × Σ → Λ) mapping each state and the input alphabet to the output alphabet
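
For a numerical example (a hypothetical machine of my own construction), take Σ = Λ = {0, 1} and a Mealy machine that outputs 1 exactly when the current input symbol repeats the previous one; the output depends on both the state (which remembers the previous symbol) and the current input:

# Mealy machine: output 1 whenever the current input repeats the previous input.
# T[(state, input)] = next state ; G[(state, input)] = output symbol
T = {("s", "0"): "s0", ("s", "1"): "s1",
     ("s0", "0"): "s0", ("s0", "1"): "s1",
     ("s1", "0"): "s0", ("s1", "1"): "s1"}
G = {("s", "0"): "0", ("s", "1"): "0",
     ("s0", "0"): "1", ("s0", "1"): "0",
     ("s1", "0"): "0", ("s1", "1"): "1"}

def mealy_run(inputs, state="s"):
    out = ""
    for symbol in inputs:
        out += G[(state, symbol)]    # output depends on state AND input
        state = T[(state, symbol)]
    return out

print(mealy_run("0110"))   # "0010" -- 1 wherever a symbol repeats its predecessor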


Moore Machine
In the theory of computation, a Moore machine is a finite state transducer where the
outputs are determined by the current state alone (and do not depend directly on the
input). The state diagram for a Moore machine will include an output signal for each
state. Compare with a Mealy machine, which maps transitions in the machine to outputs.

The Moore machine state diagram with x, y, z as input and a, b, c as output.


A Moore machine can be defined as a 6-tuple (S, S0, Σ, Λ, T, G) consisting of the following:

- a finite set of states (S)
- a start state (also called initial state) S0 which is an element of S
- a finite set called the input alphabet (Σ)
- a finite set called the output alphabet (Λ)
- a transition function (T : S × Σ → S) mapping a state and the input alphabet to the next state
- an output function (G : S → Λ) mapping each state to the output alphabet

The number of states in a Moore machine will be greater than or equal to the number of
states in the corresponding Mealy machine. This is due to the fact that each transition in
a Mealy machine can be associated with a corresponding, additional state mapping the
transition to a single output, hence turning a possibly partial machine into a complete
machine.
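
The same behaviour expressed as a Moore machine (again a hypothetical example of mine, with outputs read after each transition so the two output sequences line up) needs extra states, because the output may depend only on the current state; each state must now remember both the previous symbol and whether the last step was a repetition:

# Moore version of the same "repeat detector": the output is attached to the
# state alone, so more states are needed. T maps (state, input) -> next state,
# G maps state -> output symbol.
T = {("S", "0"): "0n", ("S", "1"): "1n",
     ("0n", "0"): "0r", ("0n", "1"): "1n",
     ("1n", "0"): "0n", ("1n", "1"): "1r",
     ("0r", "0"): "0r", ("0r", "1"): "1n",
     ("1r", "0"): "0n", ("1r", "1"): "1r"}
G = {"S": "0", "0n": "0", "1n": "0", "0r": "1", "1r": "1"}

def moore_run(inputs, state="S"):
    out = ""
    for symbol in inputs:
        state = T[(state, symbol)]
        out += G[state]              # output depends on the new state only
    return out

print(moore_run("0110"))   # "0010" -- same behaviour as the Mealy machine, with 5 states instead of 3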
