Welcome
Digital library of construction informatics
and information technology in civil engineering and construction
 

Works 

Search Results

Facilitated by the SciX project

Hits 21 to 30 of 33

Kumar V S S, Hanna A S, Natarajan P

Application of fuzzy linear programming in construction projects

Abstract: In classical optimization models, the objective function and the constraints are represented precisely, under certainty. However, many of the constraints are externally controlled, and their variations cannot be predicted to a reliable extent. This can make it difficult to represent these interacting variables for optimization. To overcome these limitations, Zimmermann's fuzzy logic approach is applied for optimization in this paper. Simulation results are embedded and used as inputs to a fuzzy linear programming model to soften the notions of "constraints" and "objective function." This approach postulates that the objective function and the constraints are of the same nature and that the distinction between them is gradual rather than abrupt. An application of this integrated approach to a case study demonstrates the efficacy of the flexible algorithm in dealing with qualitative factors in a more meaningful way than classical linear programming. One of the main advantages of this method is that it can easily be implemented in existing computer programs for optimization.
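The Zimmermann max-min approach described above can be sketched as a single crisp LP: maximise an overall satisfaction level λ subject to membership constraints on the fuzzy goal and on each fuzzy constraint. The coefficients below are illustrative, not taken from the paper's case study, and `scipy` is assumed to be available.

```python
from scipy.optimize import linprog

def zimmermann_fuzzy_lp(c, A, b, p, z0, p0):
    """Zimmermann max-min fuzzy LP.

    Maximise lam subject to
      c.x >= z0 - (1 - lam) * p0    (fuzzy goal, aspiration z0, tolerance p0)
      A x  <= b + (1 - lam) * p     (fuzzy constraints with tolerances p)
      0 <= lam <= 1, x >= 0
    Variables are [x_1..x_n, lam]; linprog minimises, so the objective is -lam.
    """
    n = len(c)
    # goal row rearranged to <= form: -c.x + p0*lam <= p0 - z0
    A_ub = [[-ci for ci in c] + [p0]]
    b_ub = [p0 - z0]
    for row, bi, pi in zip(A, b, p):
        A_ub.append(list(row) + [pi])   # a_i.x + p_i*lam <= b_i + p_i
        b_ub.append(bi + pi)
    obj = [0.0] * n + [-1.0]
    bounds = [(0, None)] * n + [(0, 1)]
    return linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

# illustrative data (not from the paper): maximise 2x1 + 3x2 with
# aspiration z0 = 14 (tolerance 4) and two fuzzy resource constraints
res = zimmermann_fuzzy_lp(
    c=[2, 3],
    A=[[1, 1], [1, 3]],
    b=[4.0, 6.0],
    p=[2.0, 2.0],   # constraint tolerances
    z0=14.0, p0=4.0,
)
lam = res.x[-1]
print(res.success, round(lam, 3))
```

Because the satisfaction level enters every row, the same LP machinery used for the crisp problem solves the fuzzy one, which is what makes the method easy to retrofit into existing optimization programs.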

Keywords: linear programming, fuzzy sets, tolerance limits, fuzzy goals, fuzzy constraints

DOI:

Full text: content.pdf (325,376 bytes) (available to registered users only)

Series: itaec:2004 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Laptali E, Bouchlaghem N, Wild S

A computer model for time and cost optimisation during pre-tender stage

Abstract: An integrated computer model for the evaluation of different project duration/cost solutions during pre-tender and pre-contract stages has been developed for multi-storey reinforced concrete office buildings. The model performs two processes: simulation and optimisation. The optimisation part, which is the subject of this paper, uses data provided by the simulation part to determine sets of time vs. cost solutions. The model takes account of the precedence relationships, the lag values, and the normal and crash values of time and cost for activities. Linear programming is used to solve the optimisation problem. Minimum increase in the project direct cost when the project duration is accelerated is achieved by the minimisation of the objective function. The model has been validated by checking the optimisation process and the validity of the theoretical basis using a hypothetical six-storey reinforced concrete office building. In addition, the model has been reviewed by construction practitioners using the same hypothetical building to check the validity of the results.
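The crashing formulation the abstract alludes to (minimise extra direct cost subject to precedence relationships, crash limits and a project deadline) can be sketched as a small LP. The two-activity network and cost slopes below are hypothetical, and `scipy` is assumed to be available.

```python
from scipy.optimize import linprog

def crash_lp(activities, n_nodes, deadline):
    """Time-cost trade-off LP on an activity-on-arrow network.

    activities: list of (i, j, normal_dur, crash_dur, cost_slope), nodes 0..n-1.
    Variables: event times t_0..t_{n-1}, then one crash amount y per activity.
    Minimise total crashing cost subject to precedence and the deadline.
    """
    m = len(activities)
    nv = n_nodes + m
    obj = [0.0] * n_nodes + [a[4] for a in activities]
    A_ub, b_ub = [], []
    for k, (i, j, d, _dc, _s) in enumerate(activities):
        row = [0.0] * nv
        row[i], row[j], row[n_nodes + k] = 1.0, -1.0, -1.0  # t_i - t_j - y_k <= -d
        A_ub.append(row); b_ub.append(-d)
    row = [0.0] * nv
    row[0], row[n_nodes - 1] = -1.0, 1.0                    # t_end - t_start <= deadline
    A_ub.append(row); b_ub.append(deadline)
    bounds = [(0, 0)] + [(0, None)] * (n_nodes - 1) + \
             [(0, d - dc) for (_i, _j, d, dc, _s) in activities]
    return linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

# hypothetical two-activity chain: the solver crashes the cheaper activity first
acts = [(0, 1, 4, 2, 100.0),   # A: normal 4 days, crashable by 2, 100/day
        (1, 2, 6, 4, 50.0)]    # B: normal 6 days, crashable by 2, 50/day
res = crash_lp(acts, n_nodes=3, deadline=8)
print(round(res.fun, 2))       # minimum extra direct cost
```

Solving repeatedly for a range of deadlines traces out the time vs. cost curve that the model's optimisation part produces.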

Keywords: Computer modelling, optimisation, time/cost curves, linear programming, Simplex Algorithm

DOI:

Full text: content.pdf (87,205 bytes) (available to registered users only)

Series: w78:1997 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Manfred Breit, Manfred Vogel, Fritz Häubi, Fabian Märki, Marco Soldati, László István Etesi, Nicky Hochmuth, Andreas Walther

Enhancement of virtual design and construction methods

Abstract: In this paper we report on a three-tier applied R&D approach for the enhancement of Virtual Design and Construction (VDC) methods at the Institute of 4D Technologies, University of Applied Sciences Northwestern Switzerland (i4Ds). In collaboration with the CAD vendor and developer cadwork informatik AG, our research focuses on the technology, its introduction into the market, the effects and difficulties of tool use, and the induced process changes. We describe the methodology, the expected outcomes of the enhancements, the research approach, initial findings and further proceedings. In the first tier, cadwork introduces an intuitive integrated 4D modeler for contractors, called LexoCad (or Baubit CAD), which has been commercially available for one year. Analogous to playing with building blocks, users create 3D building models and 4D phasing models for the construction of the building directly from 2D PDF drawings. The expected outcomes are that the virtual building blocks serve as a test-bed for constructability analyses, enhanced planning reliability, better coordination and communication, optimized procurement and widespread use in practice. The next two tiers of VDC enhancements are currently being developed at i4Ds. For the second tier we introduce a semantic, flexible, database-backed, object-oriented data structure for hierarchically structured Product, Organization and Process models (POP models), with an enhanced intuitive 3D/4D graphical user interface for the rapid generation of design alternatives. Users can easily propagate information to related property sets of construction elements and assemblies. Behavior methods (scripts) can be assigned for a variety of tasks, e.g. BOM creation, construction method modeling, creation of cost performance predictions, etc. Technology-wise, this approach moves model management from the modeler or viewer components to the database domain.
The flexible hierarchies not only allow users to manually restructure and rearrange the model to their needs but also enable automatic AI optimizers to alter the construction method, e.g. timber element, precast concrete or masonry walls. The expected outcomes are pro-active 4D planning; rapid generation, comparison and evaluation of POP design alternatives; derivation of cases from existing designs; easy and effective integration of client information into POP models; creation of performance predictions (quality, time, cost, risk, etc.) from these models; and easy creation of 4D sub-models for knowledge transfer in inter-disciplinary cooperation. In the third tier we introduce a novel process design concept which we have named Process Design Patterns (PDPs). They are based on Christopher Alexander's (1977) concept of design patterns as a formal way of documenting successful solutions to problems and as templates describing how to solve problems of a particular domain. In a study we call Process Archeology, we chose a recently finished four-storey residential concrete building and reconstructed and re-modeled the overall building processes with an inter-disciplinary team. To that end, we created the necessary 3D, 4D, process and organization models with commercially available modeling tools. We were able to derive one generic and seven specific PDPs for the whole erection of the building. We describe a strategy to apply PDPs directly to 3D building information models (BIM) to automate and optimize the planning process.

Keywords: Virtual Design and Construction, 4D Modeling, Product, Process and Organization Modeling and Simulation, Process Design Patterns

DOI:

Full text: content.pdf (386,476 bytes) (available to registered users only)

Series: w78:2008 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Patel M B, McCarthy T J, Morris P W, Elhag T M

The role of IT in capturing and managing knowledge for organisational learning on construction projects

Abstract: "Knowledge management is becoming increasingly recognised as a critical source of competitive advantage. The way organisations use knowledge and learn is increasingly being recognised as central to performance improvement. Construction is no exception. Many construction companies, and their clients, are recognising that the way they manage knowledge and learn, across the whole supply chain, can make an enormous difference to their performance and the efficiency of the construction process. This paper describes work forming part of the research project: ‘The Role of IT in Capturing and Managing Knowledge for Organisational Learning on Construction Projects’ – now known under the acronym KLICON: Knowledge and Learning In CONstruction. It sets the scene for the detailed research project by reviewing the current state of the use of IT in knowledge management and organisational learning in the construction industry. The problem is in many ways particularly difficult and important in construction with its project base, and the large number of often relatively small projects with constantly changing members of the supply chain. Information Technology (IT) offers real opportunities for capturing knowledge and feeding it back into the project organisation. This is important if performance is really to improve. This research will examine how IT can better assist knowledge management and organisational learning in construction projects. The aim of the research is to investigate how Information Technology can facilitate organisational learning and knowledge management in the construction industry. This will be achieved by: · examining how knowledge is captured and managed by firms working on construction projects; · assessing what management and IT tools are used to facilitate this, and their effectiveness. Knowledge needs and the use of IT tools will be investigated within a selected domain. This will be Requirements Capture and Management.
In KLICON, knowledge is being taken as the cognitive ability to generate insight based on information and data. Much of the current work in knowledge management focuses on the collection, classification, storage, accessing and communication of information. Important though this is, many organisations are increasingly recognising that the way information is used in order to facilitate continuous improvement is often of more immediate relevance. This, broadly, is the area of organisational learning. Organisational learning is the ability of the organisation to collect and use information so that members exploit it to learn and to improve performance. Learning is something that pervades every individual’s life in one form or another. Organisations may be capable of learning and such organisational learning may in turn impact upon various aspects of an organisation’s performance. The full paper will amplify the topics outlined above and illustrate them with examples from the construction organisations from the KLICON group. It will also include examples of the IT tools that are being used to capture the process functions and the related information requirements. The KLICON industrial partners, Ove Arup and Partners and Kvaerner Construction Ltd, are providing access to project teams for the in-depth research into requirements capture, knowledge transmission and organisational learning."

Keywords:

DOI:

Full text: content.pdf (332,916 bytes) (available to registered users only)

Series: w78:2000 (browse)
Cluster: papers of the same cluster (result of machine made clusters)
Class: class.education (0.060951) class.deployment (0.056432) class.environment (0.019154)
Similar papers:
Sound: read aloud.

Permission to reproduce these documents has been graciously provided by Icelandic Building Research Institute. The assistance of the editor, Mr. Gudni Gudnason, is gratefully appreciated.


Pilgrim M, Bouchlaghem D, Holmes M, Loveday D

Visualisation in building design and analysis

Abstract: "Research on data visualisation is undergoing major developments in a number of different fields. These developments include investigating ways of applying visualisation techniques and systems for more efficient manipulation, interpretation and presentation of data. Research into applied visualisation has so far taken place in the fields of Computational Fluid Dynamics, Medicine, Social Sciences, and the Environment. In the built environment field however, the potential of new visualisation technologies to enhance the presentation of performance data from simulation programmes (of the type used by engineering design consultants, for example) has remained almost unexplored. Improvements in this area would lead to a better and more efficient use of these simulation programs and would facilitate the interpretation of such output data by construction industry professionals, leading to better, more informed design decisions. This paper presents an initial study on Data Visualisation and its effective use in the thermal analysis of buildings. Much of the current data visualisation in the engineering and scientific world focuses on very large data sets produced by applications such as FEA, CFD or GIS. As such the tools developed to date are often too expensive or not appropriate for the visualisation of the relatively smaller data sets produced by thermal analysis tools. The objective of the work summarised here was to develop a method of visualising the data produced by the thermal analysis tools which would run on an average desktop PC and be easy to maintain/customise and above all present the data in an intuitive manner. A workplace observational study of several engineers performing such an analysis revealed each was spending a significant amount of time manipulating the output within commercial spreadsheet packages. 
Further studies revealed the most common tasks were the inspection of predicted internal conditions, location of glazed elements transmitting significant solar radiation and the identification of high internal surface temperatures. Two applications were therefore proposed. The first is designed to automatically process the output within the spreadsheet environment. The second is designed to display the solution in three dimensions to aid spatial recognition and data navigation. The spreadsheet tools were developed over a period of several months and then released to all users of the analysis tools. The 3D tool was developed over a longer period and has been subjected to small group tests. Each tool was developed using Microsoft Visual Basic, making them both easy to maintain and freely available. The 3D tool reads in flat text files produced by the analysis and automatically generates a framed HTML page with an embedded 3D VRML world describing the building and its results. This study shows that each of the proposed applications significantly improves some of the attributes associated with usability, namely learnability, efficiency, memorability, errors and satisfaction. The spreadsheet tool increased efficiency and decreased errors but offered no real satisfaction. The 3D tool offers increased satisfaction but at present does not efficiently present all of the data required. Finally, it is possible to develop low cost Data Visualisation tools to improve the overall usability of a thermal analysis tool within a built environment consultancy."

Keywords:

DOI:

Full text: content.pdf (404,505 bytes) (available to registered users only)

Series: w78:2000 (browse)
Cluster: papers of the same cluster (result of machine made clusters)
Class: class.social (0.027102) class.environment (0.018138) class.economic (0.016196)
Similar papers:
Sound: read aloud.

Permission to reproduce these documents has been graciously provided by Icelandic Building Research Institute. The assistance of the editor, Mr. Gudni Gudnason, is gratefully appreciated.


Robert Amor, Johannes Dimyadi

An Open Repository of IFC Data Models and Analyses to Support Interoperability Deployment

Abstract: Several significant projects have developed resources to support various aspects of IFC use. These projects run the gamut of: counting entities within IFC data files; providing summary statistics for IFC data files; providing metrics for information within IFC data files; determining the syntactic correctness of IFC data files; determining redundancies within IFC data files; visualizing aspects of IFC data files; navigating IFC data files; etc. In this project, rather than duplicating the functionalities provided by tools developed in any particular project, the aim was to integrate the disparate resources developed across the world to provide a central point of access to the support resources for the field. For users this provides the ability to generate significant analysis of a submitted IFC data file across the major testing systems developed worldwide. While there are many overlapping results in the resultant data sets, there is also significantly different information provided by different tools. Users are provided with the ability to invoke several visualizations and data investigations for their IFC data file, and are also provided with a summary of major statistics drawn from the analyses. For developers this includes a repository of standard IFC data files from many projects, across many versions, and of many levels of sophistication. Several of the data files contain known errors. All of the data files have been processed by the range of analysis tools linked through the support harness to provide a comparative resource when testing new tools. This project required a meta-representation of IFC data file analysis and associated metrics in order to collate and integrate the results drawn from multiple analysis tools. It also required a similar structure to manage standard IFC data files and the stored analysis results from the range of tools that tested each file.
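A minimal sketch of the kind of meta-representation such a harness needs: one record per data file that collates per-tool reports and keeps a shared summary of the fields the tools agree on. The field names (`entity_count`, `syntax_ok`, etc.) are assumptions for illustration, not the project's actual schema.

```python
import json

def collate(file_id, tool_reports):
    """Merge per-tool analysis reports for one IFC data file into a single
    meta-record: tool-specific results are kept verbatim, while a few
    commonly reported fields are lifted into a shared summary."""
    record = {"file": file_id, "tools": {}, "summary": {}}
    for tool, report in tool_reports.items():
        record["tools"][tool] = report
        for key in ("entity_count", "schema", "syntax_ok"):
            # first tool to report a shared field wins; later tools only add
            if key in report and key not in record["summary"]:
                record["summary"][key] = report[key]
    return record

# hypothetical reports from two analysis tools on the same data file
rec = collate("duplex_model.ifc", {
    "toolA": {"entity_count": 12045, "schema": "IFC2X3", "syntax_ok": True,
              "redundant_instances": 37},
    "toolB": {"entity_count": 12045, "schema": "IFC2X3",
              "pset_usage": {"Pset_WallCommon": 210}},
})
print(json.dumps(rec["summary"], sort_keys=True))
```

Keeping the raw per-tool payloads alongside the lifted summary preserves the "significantly different information provided by different tools" while still giving users one point of comparison.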

Keywords: IFC, open repository, data models, analysis

DOI:

Full text: content.pdf (581,635 bytes) (available to registered users only)

Series: w78:2010 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Robert Lipman

Developing Coverage Analysis for IFC Files

Abstract: Conformance and interoperability testing of product data exchange interfaces is essential to delivering reliable software implementations and meeting users’ expectations. For either type of testing, product data model test files, such as IFC files, are required to test the import and export capabilities of IFC interfaces in software applications. However, the vast extent of information concepts in the IFC schema makes it infeasible to generate a set of test files that provides comprehensive coverage of all concepts and their combinations. Currently, industry is using sets of IFC test files that have been contributed by multiple organizations to test data exchange implementations. Therefore, given the sets of test files that are used for testing, it is important to be able to measure and document the coverage of information concepts that are contained in the files. The coverage analysis of a set of test files can be based on many metrics. Coverage analysis metrics can be based on concepts that are generic to all files, such as the use of property sets, enumerations, geometric representations, and commonly used optional attributes. Metrics for coverage analysis can also be based on concepts specific to a particular domain or model view definition, such as precast concrete or energy simulation. Software is being developed to implement various metrics related to these coverage analysis concepts and to apply them to sets of IFC files, such as those used in the past buildingSMART IFC certification process and the current model view definitions developed for IFC implementations. Ultimately, the results of coverage analysis will determine whether the set of test files used provides sufficient coverage of all the relevant concepts that need to be tested.
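One concrete coverage metric of the kind described can be sketched in Python: count the entity types instantiated in a set of STEP physical files and report what fraction of a target concept list they exercise. The sample file fragment and the target list below are illustrative only, not from the certification file sets.

```python
import re
from collections import Counter

# matches instance lines of a STEP physical file, e.g. "#1=IFCWALL(...);"
STEP_ENTITY = re.compile(r"#\d+\s*=\s*([A-Za-z0-9]+)\s*\(")

def entity_counts(step_text):
    """Count entity instances per type in a STEP physical file body."""
    return Counter(m.group(1).upper() for m in STEP_ENTITY.finditer(step_text))

def coverage(files, target_concepts):
    """Fraction of target entity types exercised by at least one test file."""
    seen = set()
    for text in files:
        seen |= set(entity_counts(text))
    hit = seen & {t.upper() for t in target_concepts}
    return len(hit) / len(target_concepts), sorted(hit)

sample = """
#1=IFCWALL('guid',$,$,$,$,$,$,$,$);
#2=IFCPROPERTYSET('guid2',$,'Pset_WallCommon',$,(#3));
#3=IFCPROPERTYSINGLEVALUE('IsExternal',$,.T.,$);
"""
ratio, hit = coverage([sample], ["IfcWall", "IfcDoor", "IfcPropertySet"])
print(ratio, hit)  # 2 of the 3 target concepts are covered
```

Metrics for property-set usage, enumerations or optional attributes would follow the same pattern, swapping the per-file extraction function while keeping the set-union coverage calculation.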

Keywords: IFC, coverage analysis, software testing, conformance, interoperability

DOI:

Full text: content.pdf (1,887,507 bytes) (available to registered users only)

Series: w78:2010 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Sacks R

Adding intelligence to project model objects to integrate AEC information

Abstract: A number of trends in Information Technology (IT) indicate that the challenge of integrating information for construction projects may be best served by tight coupling of application methods with project model objects. A paradigm is proposed in which designers, contractors and suppliers process project information, which is stored on a central server, using application routines which reside with the data objects in the central server. The use of Intelligent Parametric Templates (IPT) in three sample applications is described. These demonstrate how Building Project Models (BPM) can support this vision of Computer Integrated Construction (CIC). Development of such “intelligent” object sets will be financed by construction component suppliers; they will be based on foundation class standards, programmed by software developers, and provided on-line by project management web sites. The obstacles and research requirements relating to the proposed paradigm are discussed, in light of the experience gained in development and use of the IPTs.

Keywords:

DOI:

Full text: content.pdf (512,233 bytes) (available to registered users only)

Series: w78:2001 (browse)
Cluster: papers of the same cluster (result of machine made clusters)
Class: class.processing (0.045608) class.software development (0.023840) class.roadmaps (0.022685)
Similar papers:
Sound: read aloud.

Permission to reproduce these documents has been graciously provided by CSIR Building and Construction Technology. The assistance of the editors, Mr. Gustav Coetzee and Mr. Frances Boshoff, is gratefully appreciated.


Shah R, Dawood N, Castro S

Automatic generation and visualization of location-based scheduling

Abstract: Accurate and visual information about working locations is vital for efficient resource planning and location-based scheduling of earthworks, but is missing from existing linear schedules. Construction managers therefore have to depend on subjective decisions for resource allocation and progress monitoring from a location perspective. This causes uncertainties in planning and scheduling, and consequently delays and cost overruns of projects. A prototype model framework was developed using the theory of location-based planning to overcome these issues. This paper focuses on case study experiments that demonstrate the functions of the model, which include automatic generation of location-based earthwork schedules and visualization of cut-fill locations on a weekly basis. An arithmetic algorithm was developed by incorporating road design data, sectional quantities, variable productivity rates, unit cost and haulage distance. The model provides weekly information on locations, directions and cut-fill quantities of earthwork under different selections: construction sequences of cut/fill sections, site access points and equipment sets. The paper concludes that the model assists in identifying the correct locations and visualizing space congestion during earthwork operations. Hence, project resources, including heavy equipment and construction materials, can be allocated more effectively and correctly from the location viewpoint, ultimately improving site productivity and reducing production cost in linear projects.
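The week-by-week allocation the abstract describes can be sketched as follows, assuming a single fixed weekly move capacity derived from a productivity rate. The sectional volumes are hypothetical, and the real model additionally weights haulage distance and unit cost.

```python
def weekly_cutfill_schedule(sections, weekly_capacity):
    """Allocate earthwork sections to weeks under a fixed weekly move capacity.

    sections: list of (chainage, volume) with volume > 0 for cut and < 0 for
    fill, processed in construction sequence. Returns a list of weeks, each a
    list of (chainage, volume_moved) tuples, splitting a section across weeks
    when it does not fit.
    """
    weeks, current, used = [], [], 0.0
    for chainage, vol in sections:
        remaining = abs(vol)
        sign = 1 if vol >= 0 else -1
        while remaining > 0:
            take = min(remaining, weekly_capacity - used)
            current.append((chainage, sign * take))
            remaining -= take
            used += take
            if used >= weekly_capacity:      # week is full, start the next one
                weeks.append(current)
                current, used = [], 0.0
    if current:
        weeks.append(current)
    return weeks

# hypothetical road sections: +cut / -fill volumes (m3) along the chainage
secs = [(0, 800.0), (100, -500.0), (200, 900.0)]
plan = weekly_cutfill_schedule(secs, weekly_capacity=1000.0)
print(len(plan))  # weeks needed to move 2200 m3 at 1000 m3/week
```

Each week's list directly gives the chainages and quantities to visualize, which is the location information missing from a conventional linear schedule.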

Keywords: Earthworks, Cut-fill quantity, Location-based scheduling, Productivity, Visualization

DOI:

Full text: content.pdf (1,723,255 bytes) (available to registered users only)

Series: convr:2013 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.


Suraj Ravindran, Prakash Kripakaran, Ian F. C. Smith

Evaluating reliability of multiple-model system identification

Abstract: This paper builds upon previous work by providing a statistical basis for multiple-model system identification. Multiple-model system identification is useful because many models, representing different sets of modeling assumptions, may fit the measurements. The presence of errors in modeling and measurement increases the number of possible models. Modeling error depends on inaccuracies in (i) the numerical model, (ii) parameter values (constants) and (iii) boundary conditions. On-site measurement errors depend on the sensor type and installation conditions. Understanding errors is essential for generating the set of candidate models that predict measurement data. Previous work assumed an upper bound for absolute values of composite errors. In this paper, both modeling and measurement errors are characterized as random variables that follow probability distributions. Given error distributions, a new method to evaluate the reliability of identification is proposed. The new method defines thresholds at each measurement location. The threshold value pairs at measurement locations depend on the required reliability, the characteristics of the sensors used and the modeling errors. A model is classified as a candidate model if the difference between prediction and measurement at each location lies between the designated threshold values. A timber beam simulation is used as an example to illustrate the new methodology. Generation of candidate models using the new objective function is demonstrated. Results show that the proposed methodology allows engineers to statistically evaluate the performance of system identification.
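The threshold-based candidate filtering can be sketched as follows: Monte Carlo sampling of the combined error distribution yields a threshold pair per sensor at the target reliability, and a model survives only if every residual falls inside its pair. The Gaussian error magnitudes and deflection values below are illustrative only, not the paper's timber beam data.

```python
import random

def thresholds(model_err_sigma, meas_err_sigma, reliability=0.95,
               n=200_000, seed=1):
    """Monte Carlo lower/upper bounds on the combined (model + measurement)
    error at one sensor location, for a target identification reliability."""
    rng = random.Random(seed)
    combined = sorted(rng.gauss(0, model_err_sigma) + rng.gauss(0, meas_err_sigma)
                      for _ in range(n))
    lo = combined[int((1 - reliability) / 2 * n)]
    hi = combined[int((1 + reliability) / 2 * n) - 1]
    return lo, hi

def is_candidate(predictions, measurements, bounds):
    """A model is kept iff prediction - measurement lies inside the
    threshold pair at every measurement location."""
    return all(lo <= p - m <= hi
               for p, m, (lo, hi) in zip(predictions, measurements, bounds))

# hypothetical beam deflections (mm) at three sensors, identical error model
b = [thresholds(0.4, 0.2) for _ in range(3)]
ok = is_candidate([10.1, 12.3, 8.2], [10.0, 12.5, 8.0], b)
print(ok)
```

Raising the required reliability widens the threshold pairs, so more models qualify as candidates; the trade-off between reliability and discrimination is exactly what the proposed method lets engineers quantify.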

Keywords: system identification, multiple models, error characterization, reliability, measurements, model prediction

DOI:

Full text: content.pdf (1,262,641 bytes) (available to registered users only)

Series: w78:2007 (browse)
Cluster:
Class:
Similar papers:
Sound: N/A.



hosted by University of Ljubljana







© itc.scix.net 2003
Powered by SciX Open Publishing Services