A Complexity Index for the Design Process
Conference Paper · August 2001
2 authors, including: Bimal Kumar, Northumbria University
This paper is aimed at developing an approach for measuring the complexity index of a
design process. In the absence of any unified definition of the complexity of a design
process, it is proposed that complexity is the ‘total information content’ associated with
the chosen complexity generating factors (CGFs) within the different activities of a design
process. The amount of information content acts as a measuring yardstick for establishing
the complexity index (CI) of the design process. A measure of complexity would benefit
managers and project planners during the planning phase of similar design processes by
identifying the sources of complexity, so that effective measures can be directed at
reducing it, and by providing a practically acceptable definition of complexity.
The results presented in this paper presuppose the involvement of human resources in all
aspects of a design process; the complexity of the design process is dependent on the
user’s skill and the context in which it is measured.
Keywords: Complexity Generating Factor (CGF), Partial Complexity Generating Factor
(Pcgf), Information Parameters (IPs), Partial Complexity Index (PCI), Overall
Complexity Index (OCI), Partial Information Content (PIC).
1. Introduction
1.1 Defining Complexity
Complexity is difficult to define. Johnson reports that there is still no agreed-upon
definition, much less a theoretically rigorous formalization, despite the fact that
complexity is currently a “hot” research topic. Johnson quotes Dan Stein, chairman of the
physics department at the University of Arizona: “Everybody talks about it. [But] in the
absence of a good definition, complexity is pretty much in the eye of the beholder.”
Researchers have tried to define it based on the characteristics of complexity in the
context of their own fields. But not a single definition seems comprehensive enough to
suit all the situations where complexity exists. Complexity is a very slippery term and
means different things to different people. Claude E. Shannon was among the first to
propose a measure of complexity, based on the very reasonable assumption
that the amount of information processed by the system in question reflects its
complexity. This idea was, however, limited to problems in the field of information
theory. Weaver (1948) discussed the ranges of complexity that are quantifiable by
analytic mathematics concentrating on specific elements; this approach suggests a
limitation on account of the nature of complexity, one typical of 17th-, 18th- and
19th-century science (Klir 1985).
Perrow (1965), Mohr (1971) and Waxman (1996) have all defined complexity in terms of
the degree of difficulty of the search process in performing the task, the amount of
thinking time required to solve work-related problems, and the body of knowledge that
may provide guidelines for performing the task, all of which appear to be subjective issues.
Balding (1971) was able to establish a linear relationship between product
complexity and the savings made by manufacturing complex components on an NC
machine rather than on conventional machines, but this relationship does not address the
complexity of the involved design (manufacturing) process. Any conclusion regarding the
complexity of the design process drawn on the above basis would be erroneous, as
shown in Fig. 1.
Ashby (1973) has highlighted the idea that complexity is directly related to a person’s
interest and has varied meanings depending on the context of study. However, this
definition neither enumerates the interests a person might have nor establishes how far
those interests affect the outcome of the result.
Thompson (1981) considers complexity as a measure of the difficulty of co-ordinating a
production process comprising activities that lack uniformity of work. Kusiak et al.
(1993) have suggested a decomposition approach for detecting parallelism among
activities, which reduces product development time. At the same time, the measure of
difficulty needs to be identified, without which arriving at any result regarding the nature
of complexity would not be a trivial task. Wallace (1987) has identified the design task,
design team, design techniques and design output as the ones influenced by the
complexity of the design process; this indicates an influence of complexity on the
overall design rather than a definition or the causes of complexity. In the area of measuring
software complexity, Chaitin (1987) and others hold the view that simple tasks can be
done by short computer programs and vice versa, measured in terms of ‘algorithmic
complexity’. The basic idea is that the complexity of a task is measured by the length of
its most compact description. But the length of even the shortest computer program
depends upon the design of the software as well as the coding. The problem with this
definition, as Chaitin concedes, is that random sequences are invariably the most complex,
because in each case the recipe is as long as the thing being specified; it cannot be
“compressed”.
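Chaitin’s point about incompressibility can be illustrated with an off-the-shelf compressor as a crude stand-in for the ‘most compact description’ (a sketch only: true algorithmic complexity is uncomputable, zlib is merely a practical proxy, and the helper name is illustrative):

```python
import random
import zlib

def compressed_length(data: bytes) -> int:
    """Length of the zlib-compressed data: a rough upper bound on the
    length of the shortest description of the data."""
    return len(zlib.compress(data, 9))

# A highly regular sequence has a short recipe and compresses well...
regular = b"ab" * 5000
# ...while a random sequence of the same length barely compresses:
# its shortest recipe is essentially the sequence itself.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))

print(compressed_length(regular) < compressed_length(noisy))  # True
```

By this proxy the random sequence is the more ‘complex’ of the two, which is precisely the counter-intuitive consequence Chaitin concedes.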
Zurn (1991) has quantified a number of factors and incorporated them into a model that can
be used for assessing new product introduction. The quantified factors include product
innovation, product complexity, design maturity, schedule pressure and others. Ahn
and Crawford (1994) have adapted a number of software metrics for analysing the
complexity of computational design processes in terms of control structure and data
dependencies between design tasks. Their approach is based on the hypothesis that the
complexity of engineering design processes (with respect to computational aspects) can
be analysed and evaluated by adapting criteria identified from “software complexity” in
computer science, drawing similarities between computer programs and computational
design processes. This method would work only with the parts of the design process that
have computational content and would not address the qualitative aspects of the design
process.
Pugh (1996) has quantified the complexity of a product and defined it as being
proportional to the number of parts, the number of different types of parts, the number of
interconnections and interfaces, and the number of functions that the product is expected
to perform. He has also stressed the need to combine the quantified complexity with an
approach called ‘load lines’ to reach the goal of simplicity. This approach suggests a
measure of product complexity without addressing the complexity of the associated
design process.
Biren (1998) is of the view that it is the combination of new and old practices, such as
old-fashioned habits, new life-cycle environments, changes and mounting regulations,
that imparts complexity to many product development efforts. In the absence of any
measurability, these causes of complexity tend towards a subjective definition.
1.2 The Growing Complexity in Companies
Complexity is a term normally used in everyday situations to describe a characteristic
that it is not yet possible to quantify precisely. Partly on this account, the lead time for
design and operations planning accounts for 60% of the delivery time for customer-specific
products [Wiendahl]. Research has often shown that both the development times for
products or variants and the production lead times in the direct areas are about 50% too
high, taking international competition as a basis.
Frizelle & Woodcock (1995), Frizelle (1995), Frizelle (1998) and Braha et al. (1999) have
classified 'complexity' as having structural and operational aspects. The only difference in
the works of Frizelle and Braha is that the former has assumed the structural component
of the complexity to arise from the impact the product structure has on the resources that
will produce it whereas the latter states that it is a function of its representation. Similarly
for the functional part the former takes into account the uncertainty involved in
manufacturing and the latter believes that design complexity is a function of its
probability of successfully achieving the required specifications.
Complexity in production arises both from the production structures (‘structural
complexity’) and from the production procedures involved (‘dynamic complexity’). The
former, structural complexity, is due to the superposition of the product range on the
resources required for its manufacture. A product that is simple for one facility to
manufacture may be complex for another; moreover, an increase in either the product
range or the number of processes involved will make the overall structure more complex.
Structural complexity is thus measured by the number of processes and products involved.
The latter, dynamic complexity, relates to the operational side and deals with the
uncertainty that appears once the plant starts to manufacture. Typical examples are plant
breakdowns, absenteeism, shortages and unbalanced flow, whose effect is to generate
queues. The resulting complexity is called operational complexity.
1.3 Need for measuring complexity
It has been established that measurement is vital for controlling any process, because it is
difficult to manage what cannot be measured (DeMarco, T. 1982). Many researchers, from
the noted physicist Lord Kelvin to software engineers such as Tom DeMarco and
researchers such as Chryssolouris (1994), have stressed the importance of measuring
complexity. At present, design process complexity is highly subjective and is the cause of
many engineering and management related problems.
The complexity of design process is the cause of many management problems in
industrial companies (Frizelle 2000). Practitioners in the area of project management
frequently describe their projects as simple or complex (Baccarini, 1996) when they are
discussing management issues. This indicates a practical acceptance that complexity
makes a difference to the management of projects. Therefore an understanding of project
complexity and how it might be managed is of significant importance.
The complexity of products and their associated design processes has always been of some
concern to mankind. The present way of attributing complexity to a product and its
design process appears to be a simple mental task at which design evaluators are
highly proficient, one they undertake every time they interact with the product and/or its
design process. Researchers (Waxman 1996) have defined complexity as a subjective
matter that depends to a great extent on the user solving a problem. Design evaluators
have often maintained records of the complexity of products or their related design
processes as mental models. On closer inspection, however, attributing complexity to
anything in a qualitative sense is found to be a very subjective issue. Usually design
evaluators are able to assign complexity to products and/or their related design processes,
but often they are not able to say how they arrive at that conclusion.
Researchers (Chryssolouris, 1994) call for increased efforts to make this complexity
quantifiable. Only after this measurability is achieved can fresh approaches be
developed for reducing complexity in production systems and so produce a systematic
reduction in complexity.
A transition from a qualitative to a quantitative understanding of complexity would be a
highly desirable and necessary step towards understanding the overruns of design projects
(Calinescu, A. et al. 2000). Design projects are typically plagued with schedule and cost
overruns ranging between 41%-258% and 97%-151% respectively (Norris, K.P. 1971,
Murmann, P.A. 1994). A factor in these overruns can be attributed to projects being more
complex than originally anticipated at the initial planning stages. Although research has
been carried out in this area, there is currently no method for measuring the complexity
of the design process. In order to schedule projects and estimate costs more accurately, it
is essential that the complexity of the design process can be measured.
1.4 Complexity Measurement
One of the earliest researchers to establish that complexity could be measured was
Von Neumann (Gidado, 1997), according to whom complexity can be measured
provided it is related to such things as the dimension of a state space, the length
of a programme or the magnitude of a ‘cost’ in money or time. Rosen (1987) also shares
the idea of a threshold value of complexity as expressed by Von Neumann.
Griffin (1997) has developed a number of metrics, including product complexity,
management complexity and amount of change, to measure the product development cycle.
One of the most respected attempts at developing an objective measure of complexity
was made by John Tyler Bonner of Princeton University, who suggested counting the
number of different cell types in an organism as the clue to its complexity. Bonner was
able to show higher complexity in larger species by this measure, but he did not try to
determine whether it increased through evolutionary time.
Salingaros (1997) has established the ideas of architectural temperature and harmony in
order to quantify the qualitative attributes of building structures and so establish the
complexity index of world-famous buildings. His model uses the ideas of
Christopher Alexander to estimate certain intrinsic qualities of a building and predicts a
building’s emotional impact. This model of the complexity of architectural buildings is
quite subjective and, in the absence of a scale, lacks objectivity.
Jeffrey Pressing defines complexity as the minimal cost of computing an approximate
solution to the problem; so instead of basing complexity on the number of structural levels
or the capacity for adaptation, complexity is here based on a minimum set of resources
(cost) (page 3).
Bashir et al. (1999) have determined product complexity on the assumption that it
depends on the number of functions a product is designed to deliver and the depth of its
functional tree. The use of this approach is limited by the depth of the functional tree,
which in turn depends on the function of the product and its end user. According to this
method a product could have more than one complexity index, as the index would be
governed by the function of the product. At the same time, the approach does not
address the design process underlying the manufacture of that product. According to
Seth Lloyd (Suh 1999), there are some three dozen different ways scientists use the word
‘complexity’.
Suh (1990), Haik, Calinescu and Davidson have indicated the importance of information
content when dealing with complexity in their respective areas of interest. These
researchers have drawn an analogy with the pioneering work of Shannon (1948) on the
mathematical theory of communication, where the amount of information (for example,
in a message transmitted between individuals) increases as the number of possible
messages increases and decreases as that number decreases. A greater set of possible
messages corresponds to greater uncertainty on the part of the recipient as to the message
content, and hence a higher information content in the message. These researchers have
been able to measure the information content in their areas of interest, but the scale of
information measurement is not considered at any stage.
Bennett (Suh 1990) coined a different measure of complexity called ‘logical depth’, which
gauges how long it would plausibly take for a computer to go from a simple blueprint to
the final product. Though useful, it seems limited to processes in which there is a
logical structure of some sort. Tang (2001) does not consider the complexity and
complicatedness of a system as synonyms: complexity is regarded as an inherent
property of systems, and complicatedness as a derived property that characterises an
execution unit’s ability to manage a complex system.
2. Generators of Complexity and Context
2.1 Complexity Generating Factors (CGFs)
These are the proposed factors that have been assumed to impart complexity to a
design process. The choice of Complexity Generating Factors is dependent on the user
and is used as a first step in measuring the complexity of the design process in a
particular part of the context. There is a comprehensive list of CGFs for every part of the
context. Annexure: A
2.2 Partial Complexity Generating Factors (Pcgfs)
All the Complexity Generating Factors are subdivided into partial complexity generating
factors (Pcgfs). This division of CGFs into Pcgfs is necessary to facilitate the measurement
of the complexity arising from the CGFs. Each Pcgf contributes some partial complexity
(PC) in its own way towards its CGF, and the sum total of these PCs gives an index known
as the partial complexity index (PCI) of the design activity on account of the chosen CGF.
Annexure: B
3. Proposed Understanding about the complexity of the design process
3.1 Proposed Definition of Complexity
The complexity of a design process with reference to a user is defined in terms of the
information content associated with the chosen Complexity Generating Factors of design
activities in a given part of the context of a design process.
3.2 Equation Development
Information content has been used as a measuring tool for complexity for a number of
reasons: people involved in the design process deal with information in various forms,
such as reading drawings and selecting processes, teams, materials and the like.
Wiendahl (1994) has also established through surveys that nowadays 75% of the
employees in industrial companies do not work with materials but with information.
The concept of ‘entropy’, which is used in thermodynamics to quantify the ‘disorder’
that arises in a system due to ‘variety’ and ‘uncertainty’, has been used in measuring the
complexity of a design process. In information theory too, entropy has been defined as
the ‘expected amount of information’ needed to describe a system covering all the
various possible states of the system, as given by Shannon’s equation for measuring
information content:

I = - Σ pi log2 pi ----------------------------------------(1)

pi = probability of the system being in a particular state i
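Shannon’s equation (1) can be sketched in a few lines of Python (an illustrative helper, not part of the model itself):

```python
import math

def information_content(probabilities):
    """I = -sum(pi * log2(pi)) over the states of a system; states with
    zero probability contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely states carry 2 bits of information; two
# equally likely states carry 1 bit.
print(information_content([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(information_content([0.5, 0.5]))                # 1.0
```

Skewed distributions carry less information: as one state becomes near certain, the value approaches zero, matching the ‘uncertainty’ interpretation above.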
Similarly, equations have been proposed for measuring the information content in a
design activity/process, as given below, presupposing the involvement of different
resources (e.g. people, machines and materials) in different departments.

For a design activity the total information content ‘Id’ could be written as:

Id = - Σ (r = 1 to G) Σ (i = 1 to P) Σ (j = 1 to S) pij log2 pij --------------------(2)

And for a design process the total information content ‘Ip’ could be written as:

Ip = - Σ (n = 1 to N) Σ (r = 1 to G) Σ (i = 1 to P) Σ (j = 1 to S) pij log2 pij ----(3)

N = design activities
G = Complexity Generating Factors (CGFs)
P = Information Parameters (IPs)
S = states of the Information Parameters (IPs)
pij = probability of an Information Parameter (IP) ‘i’ being in state ‘j’
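Equations (2) and (3) can be sketched as nested sums over a hypothetical data layout (an activity as a list of CGFs, each CGF a list of IPs, each IP a list of state probabilities; the layout and names are assumptions for illustration):

```python
import math

def activity_information(activity):
    """Id for one design activity: -sum of pij*log2(pij) over its CGFs
    (r = 1..G), the IPs under each CGF (i = 1..P) and the states of
    each IP (j = 1..S), as in equation (2)."""
    return -sum(
        p * math.log2(p)
        for cgf in activity
        for ip in cgf
        for p in ip
        if p > 0
    )

def process_information(process):
    """Ip for a design process: Id summed over its N activities (3)."""
    return sum(activity_information(a) for a in process)

# One activity with one CGF and one IP having two equiprobable states
# contributes 1 bit; a process of two such activities contributes 2.
activity = [[[0.5, 0.5]]]
print(activity_information(activity))             # 1.0
print(process_information([activity, activity]))  # 2.0
```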
4. Description of the Model
The terminology used in this model can be found in the appendix as Annexure: Terminology,
and a schematic diagram of the model is attached as Annexure D.
The model has three main modules: contextual, partial complexity generating factors,
and information processing. User input is needed to select the type of design activity
and then the part of the context in which the complexity has to be measured. Each part of
the context has a set of predefined complexity generating factors, which form the first
step in measuring the complexity of the design activity. These CGFs are further
subdivided into partial complexity generating factors (Pcgfs), which are instrumental in
‘activating’ the information parameters (IPs) without which the progress of the design
activity is not possible. After the appropriate selection of the information parameters and
their ‘states’, the information-processing module calculates the information content using
the equations given above. The Complexity Index Generator (CIG) within the
information-processing module shown in Annexure D is responsible for generating the
partial and/or overall complexity index (PCI/OCI) of a design process.
It consists of the following:
1. A matrix showing the different ‘CAUSES’ of complexity while managing the
Information Parameters (IPs)/States/Solution steps on account of a selected
complexity generating factor.
2. Development of assessment scales for measuring the ‘CAUSES’ identified in
the matrix at serial number 1 above.
3. Analysis of the identified ‘CAUSES’ on user-defined degrees within the
assessment scales.
4. Compilation and interpretation of the information for complexity as a result of
the steps at serial numbers 2 and 3 respectively.
5. Working of the Model through an example
Description of the example:
An operator has to accomplish a design process (say, manufacturing a gear) in which one
of the design activities is, say, machining. The objective is to measure the complexity of
this design activity.
The very first step is to fix the part of the context in which this activity is to be
measured for complexity (say, Work). Next comes the selection of the complexity
generating factors, which is user dependent (say, Usage of Resources). After this a
selection is made of the partial complexity generating factor (say, Operator), as this helps
in activating the Information Parameters in the Partial CGFs module.
Annexure D
After fixing the above information the following steps are followed:
STEP: 1
This method presupposes that the PIC of the design activity has been worked out using
equation (2) within a specified set of conditions. Annexure E
STEP: 2
Either accept the matrix of ‘CAUSES’ as it stands or alter the positions of the causes
according to the operator’s judgement. This step is important because the X-axis of the
matrix denotes the distribution of Partial Information Content (PIC) in increasing order,
which by definition is a measure of complexity. For some operators some causes may be
more relevant than others from the point of view of complexity; in that case those causes
would be placed further from the origin, and vice versa. Annexure F
STEP: 3
Distribute the Partial Information Content (PIC) found in STEP: 1 along the X-axis in
increasing order, as the value of complexity increases along the axis. This distribution
rests on the fact that the PIC obtained in STEP: 1 results from the use of the
IPs/States/Solution Steps of the design activity, which also form part of this matrix. One
has to be careful while distributing the partial information content: it should be
distributed in a logical way and should not exceed its maximum value. Logically, the
operator should not assign a higher contribution of the PIC to a cause like ‘Manageable
IPs’ at (I, A) than to a cause such as ‘Broad range of IPs which are vague or ambiguous’
at (III, A), which is placed at a higher level, and vice versa; here the user has to draw on
experience and common sense.
For example, in this case a PIC of 50 has been distributed as 10, 20, 40 and 50 within the
different levels; it could have been 10, 20, 30, 50 or 10, 30, 40, 50 depending upon the user.
Annexure F
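The constraint in this step, that the distributed values increase along the X-axis and never exceed the total PIC, can be captured in a small check (the function is an illustrative helper, not part of the original model):

```python
def valid_pic_distribution(values, pic_total):
    """A distribution of the Partial Information Content across matrix
    levels is acceptable if the values increase along the X-axis and
    none exceeds the PIC total."""
    increasing = all(a < b for a, b in zip(values, values[1:]))
    bounded = max(values) <= pic_total
    return increasing and bounded

# The distributions quoted in the text for a PIC of 50:
print(valid_pic_distribution([10, 20, 40, 50], 50))  # True
print(valid_pic_distribution([10, 30, 40, 50], 50))  # True
print(valid_pic_distribution([10, 60, 40, 50], 50))  # False
```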
STEP: 4
Identify the probable cause(s) from the matrix showing the different ‘CAUSES’ of
complexity while managing the Information Parameters (IPs)/States/Solution steps, using
the “Problem Solving” skill. For each of the identified cause(s) in a particular level,
follow the steps given below.
STEP: 5
Go to the table of ‘Assessment on Technicality Scale’ and fill in the details in the sub-table
of ‘Value of st for chosen CAUSE(S)’ for all the causes identified in STEP: 4. Repeat
the same exercise for the tables of ‘Assessment on Analysability and Difficulty Scales’.
Annexure: TN, Annexure: AN, Annexure: DN
STEP: 6
Go to the table of ‘Summary of Assessment of Causes on different scales’ to fill in the
details regarding the causes and the individual assessment values for technicality,
analysability and difficulty (st, sa, sd). Annexure: SN
STEP: 7
Go to the table of ‘Summary of the Actual Partial Information Content’ and fill in the
details to find the number of causes and the PIC(s) in each level of the matrix, so that the
range of values for the PIC is ascertained. Annexure: SN
STEP: 8
Fill in the details of the scales for each level in the table of ‘Summary of Actual PIC for
each level’.
STEP: 9
Draw a conclusion about the category of complexity of the design activity on the basis of
the grand total of the partial information content (GTPIC).
1. With reference to STEP: 4 the ‘Causes’ are identified as (III, A), (IV, B) and (III, C).
2. As per STEP: 5, go to the tables of the various assessment scales and fill in the
details of the causes and your judgements about the various factors of the three
scales. For example, in the “Assessment on Technicality Scale” for the causes
mentioned in (1), columns III and IV would be filled in by the operator for the
factors of the technicality scale: for cause (III, A) the operator has assigned st2,
which is 16 and denotes that the ‘worker is able to make partial use of his skills’;
for cause (IV, B) the operator has assigned st4, which is 40, denoting that the
‘worker is unable to use his skills’ in spite of possessing them; and for cause
(III, C) the operator has assigned st4, which is 32, denoting that the worker is
unable to use the skills. So the total value of st for level III is 48 and that for
level IV is 40.
3. Similarly these causes are assessed for sa and sd: the total value of sa for level III
is 40 and that for level IV is 30, and the total value of sd for level III is 40 and
that for level IV is again 30.
4. Go to the table of ‘Summary of Assessment of Causes on different scales’ as per
STEP: 6 and fill in the details to find the contribution of the partial information
content associated with each identified cause. Annexure: SN
5. As per STEP: 7, go to the table ‘Summary of the Actual Partial Information
Content’. In this example level III has an actual PIC of 128 on account of two
causes, so the range of PIC for this level should be <= 240, and for level IV,
which has an actual PIC of 100, the range of PIC should be <= 150.
6. As per STEP: 8, go to the table of ‘Summary of Actual Partial Information
Content (PIC) for each level’ and fill in the details to find the Grand Total of the
PIC (GTPIC), which in the present case is 228. Annexure: SN
7. As per STEP: 9, draw a conclusion on the category of the GTPIC obtained in (6):
the Grand Total of PIC (GTPIC) = 228, which is Highly Complex.
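The arithmetic of the worked example can be checked with a short script (the dictionary layout is hypothetical; the per-level scale totals are those quoted above):

```python
# Technicality (st), analysability (sa) and difficulty (sd) totals
# for the two levels of the worked example.
levels = {
    "III": {"st": 48, "sa": 40, "sd": 40},  # causes (III, A) and (III, C)
    "IV":  {"st": 40, "sa": 30, "sd": 30},  # cause (IV, B)
}

# Actual PIC per level is the sum of its scale totals; GTPIC sums the levels.
pic = {level: sum(scores.values()) for level, scores in levels.items()}
gtpic = sum(pic.values())

print(pic)    # {'III': 128, 'IV': 100}
print(gtpic)  # 228
```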
Complexity of the design activity has been divided into categories A, B, C and D
respectively, from extreme complex to simple, depending on the value of its grand total:

Extremely Complex (A): 360 < GTPIC <= 450
Highly Complex (B): 180 < GTPIC <= 360
Medium Complex (C): 90 < GTPIC <= 180
Simple (D): 30 < GTPIC <= 90
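The banding can be expressed as a small lookup (a sketch; the label for the lowest band follows the A-to-D ‘extreme complex to simple’ ordering described above):

```python
def gtpic_category(gtpic):
    """Map a grand total of partial information content (GTPIC) onto the
    four complexity categories, from A (extremely complex) to D (simple)."""
    bands = [
        (360, 450, "Extremely Complex"),
        (180, 360, "Highly Complex"),
        (90, 180, "Medium Complex"),
        (30, 90, "Simple"),
    ]
    for low, high, label in bands:
        if low < gtpic <= high:
            return label
    return "Out of range"

print(gtpic_category(228))  # Highly Complex
```

For the worked example, a GTPIC of 228 falls in the 180-360 band, matching the ‘Highly Complex’ conclusion drawn above.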
1. The above table summarising the final partial information content for each level
shows that the PIC at level III is greater than the PIC at level IV. This helps the
user to direct efforts at reducing the effect of the ‘CAUSES’ at level III rather
than at level IV.
2. Despite the user having arranged the ‘CAUSES’ in increasing order of
information content, the results show that this ordering does not necessarily hold;
rather, the information content depends on the degree of assessment of the
factors considered within the different scales.
3. The individual causes identified at the different levels can be compared on the
basis of their actual information content, as shown in the table of summary of
the actual partial information content (PIC).
1. The methodology can be applied to any design activity that makes use of the
skills of the operators.
2. The methodology can be applied as long as the IPs/States/Solution Steps can be
identified.
3. The complexity index is user dependent and therefore dynamic.
4. The methodology does not take into account the ergonomics of the surroundings.
5. The accuracy of the results depends on the observer and on the data used in
finding the states.
Norris, K.P., “The accuracy of project cost and duration estimates in Industrial R & D”, R
& D Management, 2, pp 25-36, 1971.
DeMarco, T., “Controlling Software Projects”, Yourdon press, New York, 1982.
Bennett, J. (1991). International Construction Project Management: General Theory and
Practice. Butterworth – Heinemann, Oxford
Duffy AHB, Duffy SM, Andreasen MM, “Using Design Complexities in validating the
Design Coordination Framework (Unpublished Report, DMEM, University of
Strathclyde, Glasgow, UK)
Murmann, P.A., “ Expected development time reductions in the German Mechanical
Industry”, Journal Product Innovation Management, 11, pp 236-252, 1994.
Wiendahl, H.P., Scholtissek, P., “ Management and Control of Complexity in
Manufacturing”, Keynote Papers, Annals of the CIRP Vol. 43/2/1994.
Chryssolouris, G; “ Measuring complexity in manufacturing systems” Working paper,
Department of Mechanical Engineering and Aeronautics, University of Patras 26110
Greece (1994)
Frizelle, G. and Woodcock, E. (1995). “Measuring Complexity as an Aid to developing
operational strategy”, Int. Journal of Operations and Production Management, 15(5), pp.
Frizelle, G.D.M. (1995) “An entropic measurement of complexity in Jackson networks,
working paper no.13. Manufacturing Engineering Group, Department of Engineering,
University of Cambridge, UK.
Baccarini, D., “ The Concept of Project Complexity- a review”, International Journal of
Project Management, 14, 201-204, 1996.
Gidado, K.I., “Project Complexity: The focal point of construction production planning”,
Construction Management and Economics, Vol. 14, pp 213-225, 1996.
Little G, Tuttle R, Clark D E R, Corney J, “A Feature Complexity Index”, Proc Instn
Mechanical Engineers Vol 212, Part C, 1998, pp 405
Salingaros, Nikos A. (1997), “Life and Complexity in Architecture from a
thermodynamic analogy”, Physics Essays, volume 10, 1997, pages 165-173.
Frizelle, G. (1998). “The Management of Complexity in Manufacturing”, Business
Intelligence, London.
Frizelle, G., Gregory, M. J., “ Complexity and the Impact of Introduction new Products”,
International Conference on Complexity and Complex Systems in Industry, University of
Warwick, UK, 19-20 September, 2000.
Calinescu, A., Efstathiou, J., Sivadasan, S., Schirn, J., Huaccho Huatuco, L.,
“Complexity in Manufacturing: An Information Theoretic Approach”, International
Conference on Complexity and Complex Systems in Industry, University of Warwick,
UK, September 19-20, 2000.
Laurikkala H, Puustinen E, Pajarre E, Tanskanen K, “Reducing complexity of modelling
in large delivery projects”, Proceedings of the Int. Conf. On engineering design, ICED 01
Glasgow, August 21-23, (2001) pp 166-167
Earl C, Johnson J, Eckert CM, “Complexity of Planning Design Processes”, ASME
Design Theory and Methodology Conference, Paper, DTM-21697, Pittsburgh 9-12
September, 2001
Kusiak, Andrew and Wang, Juite (Ray), “Qualitative Analysis of the design process”, in
“Intelligent Concurrent Design: Fundamentals, Methodology, Modelling and Practice”,
ASME 1993.