Design of an Articulated Manipulator (Four Degrees of Freedom, Cylindrical-Coordinate, Hydraulically Driven)
Graduation Project (Thesis) Task Book

Department: Department of Mechanical Engineering
Supervisor: Xie Weirong    Title (rank):
Student: Zhang Anhuai    Class: Mechanical Manufacturing 05-1
Student ID: 0515011127
Project title: Design of an Articulated Manipulator

Design content, objectives, requirements, and schedule

Content:
1. Overall structural design of the articulated manipulator
2. Design of the drive system for each joint of the manipulator
3. Design of the manipulator control system
4. Drawing of the assembly drawing, part drawings, and the drive and control system schematics

Requirements:
1. Collect references (at least 5 relevant books and 10 articles);
2. Draw the part drawings, assembly drawing, and schematics;
3. Write the thesis to the required standard;
4. Translate one English article of about 3,000 words.

Schedule:
1. Weeks 1-6: preparation for the graduation project - study the topic, collect references, complete the graduation internship, clarify the research objectives and tasks, and outline the overall scheme;
2. Weeks 7-10: produce the relevant drawings;
3. Weeks 11-13: finish and polish the work, write the thesis, and prepare for the defense.

Supervisor signature: Xie Weirong          Date:
Department review:
This form is filled in by the supervisor and reviewed by the department.
Graduation Project (Thesis) Opening Report

Project title: Design of an Articulated Manipulator
Project source: Production practice
Project type: AX
Supervisor: Xie Weirong
Student: Zhang Anhuai    Student ID: 0515011127    Class: Mechanical Manufacturing 05-1

Current state of research, purpose, and significance

Current state of research:
The manipulator is a high-technology automated production device developed over the past few decades. It can be programmed to carry out a variety of planned tasks and, in both construction and performance, combines the respective advantages of humans and machines, embodying in particular human intelligence and adaptability. Thanks to the accuracy of manipulator operation and the ability to complete work in many kinds of environments, manipulators have broad prospects in every sector of the national economy.
Our national standard GB/T 12643-90 defines the manipulator as "a mechanical device with motion functions similar to those of the human arm, able to grasp and place objects in space or perform other operations." Manipulators fall into two broad classes, special-purpose and general-purpose. A special-purpose manipulator is an accessory to a host machine; its motions are simple, it handles a single kind of workpiece, it runs a fixed (sometimes adjustable) program, and it is used in high-volume automated production - for example, loading manipulators on automatic lines, automatic tool-changing manipulators, and assembly-welding manipulators. A general-purpose manipulator has an independent control system, a changeable program, and flexible, varied motions; it suits small- and medium-batch automated production where the product mix changes. With a large working range, high positioning accuracy, and strong versatility, it is widely used in flexible automatic lines.
Manipulators were first applied in the automobile industry, typically for welding, spray painting, loading/unloading, and material handling. A manipulator extends and amplifies the functions of the human limbs and brain: it can replace people working in dangerous, harmful, toxic, low-temperature, or high-temperature environments, and take over heavy, monotonous, repetitive labour, raising productivity and ensuring product quality. Today manipulators are used mainly in manufacturing, especially electrical-equipment manufacture, automobile manufacture, plastics processing, general machinery, and metalworking. Combined with CNC machining centres, automated guided vehicles, and automatic inspection systems, manipulators can form flexible manufacturing systems (FMS) and computer-integrated manufacturing systems (CIMS) to automate production. As production develops and their functions and performance keep improving, the fields of application of manipulators continue to expand.

Purpose and significance:
Industrial manipulators have many advantages that humans cannot match and meet the needs of large-scale socialized production. The main advantages are:
1. They can replace people in dangerous and harmful operations. With a design suited to the working environment and appropriate materials and structures, a manipulator can work under extreme heat or cold, abnormal pressure, harmful gases, dust, or radiation, and in dangerous operations such as stamping and fire fighting. Industrial manipulators or robots should be promoted in accident-prone trades such as stamping, die casting, heat treatment, forging, spray painting, and arc welding with its intense ultraviolet radiation.
2. They can work for long periods without fatigue, freeing people from heavy, monotonous labour and extending human capabilities. A person tires or grows bored after a few hours of continuous work, whereas a manipulator, given proper maintenance and inspection, can perform monotonous, repetitive work for long stretches.
3. Their motions are accurate, which stabilizes and improves product quality while avoiding human operating errors.
4. Manipulators, especially general-purpose industrial manipulators, are versatile and flexible, adapting well to frequent product changes and thus to the needs of flexible production.
5. Manipulators markedly raise labour productivity and lower costs.
Because manipulators play such a large role in industrial automation and informatization, countries around the world attach great importance to their application and development. In our country the application of manipulators is still at an early stage, yet it has already shown many irreplaceable advantages and broad application prospects. Over the past decade or so, manipulator development has not only become increasingly refined but has also spread into many fields of application.
An articulated manipulator occupies little space, has a large working range and low inertia, needs little driving power, can grasp objects near its base, and can route its motion around obstacles, so research on articulated manipulators is of real significance.

Project type codes:
(1) A - engineering practice; B - theoretical research; C - development of research apparatus; D - computer software; E - comprehensive application.
(2) X - real project; Y - simulated project.
Both (1) and (2) must be given, e.g. AY, BX.
Research content
1) Draw up the overall scheme, in particular a design that organically integrates the sensors, the control method, and the mechanical body.
2) Select suitable hand, wrist, and body structures according to the given degrees of freedom and technical parameters.
3) Carry out the design calculations for each component.
4) Design and draw the working assembly drawing of the industrial manipulator.
5) Design and draw the hydraulic system diagram.
6) Draw the electrical control diagram.
7) Write the design-calculation report.

Implementation plan and schedule
Implementation plan: collect references, draw up the overall structural design, and select suitable hand, wrist, and body structures according to the given degrees of freedom and technical parameters.
Schedule:
1) Weeks 1-6: preparation for the graduation project - study the topic, collect references, complete the graduation internship, clarify the research objectives and tasks, and outline the overall scheme.
2) Weeks 7-10: produce the relevant drawings.
3) Weeks 11-13: finish and polish the work, write the thesis, and prepare for the defense.
Main references consulted
1. Monographs
[1] Li Yunwen. Industrial Manipulator Design [M]. Beijing: China Machine Press, 1996.
[2] Kato Ichiro. Manipulator Atlas [M]. Shanghai: Shanghai Scientific & Technical Publishers, 1979.
[3] Electromechanical Research Institute, Machinery Research Academy, First Ministry of Machine Building (ed.). Industrial Manipulator Atlas. Beijing: Electromechanical Research Institute, 1976.
2. Papers
[1] Fu Yazi. Manipulator Control System [C]. Hubei: Hubei University of Technology, 2006.
[2] Design of a General-Purpose Manipulator [C]. Hubei: Hubei University of Technology, 2006.

Supervisor's opinion
Supervisor signature:          Date:
Graduation Project (Thesis) Defense Application Form
Project title: Design of an Articulated Manipulator
Supervisor (title): Xie Weirong
Reason for application: The design has been completed as required by the supervisor; I hereby apply for the defense.
Department: Department of Mechanical Engineering
Class: Mechanical Manufacturing 05-1    Student ID: 0515011127
Student signature:          Date:
Graduation Project (Thesis) Supervisor Evaluation Form

No. | Item (science/engineering, management) | Item (humanities) | Max | Score
1 | Workload | Foreign-language translation | 15 |
2 | Literature reading and foreign-language translation | Literature reading and literature review | 10 |
3 | Technical level and practical ability | Creativity and academic level | 25 |
4 | Research results, basic theory and professional knowledge | Argumentation ability | 25 |
5 | Written expression | Written expression | 10 |
6 | Work attitude and compliance with standards | Work attitude and compliance with standards | 15 |
Total | | | 100 |

Comments (state whether the student may proceed to the defense):
Supervisor signature:          Date:
The Graduation Project (Thesis) Supervision Record Book is attached separately.
Graduation Project (Thesis) Reviewer Evaluation Form
Student: Zhang Anhuai    Class: Mechanical Manufacturing 05-1    Student ID: 0515011127
Thesis title: Design of an Articulated Manipulator
Reviewer:          Reviewer's title:

No. | Item (science/engineering, management) | Item (humanities) | Max | Score
1 | Workload | Foreign-language translation | 15 |
2 | Literature reading and foreign-language translation | Literature reading and literature review | 10 |
3 | Technical level and practical ability | Creativity and academic level | 25 |
4 | Research results, basic theory and professional knowledge | Argumentation ability | 25 |
5 | Written expression | Written expression | 10 |
6 | Work attitude and compliance with standards | Work attitude and compliance with standards | 15 |
Total | | | 100 |

Comments:
Reviewer signature:          Date:
Graduation Project (Thesis) Defense Evaluation Form
Student: Zhang Anhuai    Class: Mechanical Manufacturing 05-1    Student ID: 0515011127
Thesis title: Design of an Articulated Manipulator

No. | Item | Criteria | Max | Score
1 | Presentation content | Clear line of thought; accurate language and clear concepts; correct arguments; sound experimental methods and reasonable analysis and synthesis; conclusions of practical value. | 40 |
2 | Presentation delivery | Well prepared; within the time limit. | 10 |
3 | Innovation | Improvement on or breakthrough beyond previous work, or original insight. | 10 |
4 | Defense | Answers grounded in theory, with clear basic concepts; main questions answered accurately and in depth. | 40 |
Total | | | 100 |

Defense panel comments:
Panel chair (signature):          Date:

Defense committee opinion:
Committee head (signature):          Date:
Graduation Project (Thesis) Defense Record
Student: Zhang Anhuai    Class: Mechanical Manufacturing 05-1    Student ID: 0515011127
Thesis title: Design of an Articulated Manipulator
Defense time:          Defense venue:
Defense committee members:

Question 1 - Questioner:    Question:    Answer (key points):
Question 2 - Questioner:    Question:    Answer (key points):
Question 3 - Questioner:    Question:    Answer (key points):
Question 4 - Questioner:    Question:    Answer (key points):
Question 5 - Questioner:    Question:    Answer (key points):
Question 6 - Questioner:    Question:    Answer (key points):
Question 7 - Questioner:    Question:    Answer (key points):
Question 8 - Questioner:    Question:    Answer (key points):

Recorder's signature:          (attach extra pages if needed)
Graduation Project (Thesis) Overall Grade Sheet
Student: Zhang Anhuai    Class: Mechanical Manufacturing 05-1
Thesis title: Design of an Articulated Manipulator

Grade category | Grade
I. Supervisor's grade |
II. Reviewer's grade |
III. Defense panel's grade |
Overall grade: I × 40% + II × 20% + III × 40%
Final rating:

Note: grades are given separately (on a 100-point scale) by the supervisor, the reviewer, and the defense panel, and the final rating is assigned as Excellent (90-100), Good (80-89), Fair (70-79), Pass (60-69), or Fail (below 60). The supervisor's grade counts for 40%, the reviewer's for 20%, and the defense panel's for 40%.
Graduation Project (Thesis)
Design of an Articulated Manipulator

Student: Zhang Anhuai
Student ID: 0515011127
Department: Department of Mechanical Engineering
Class: Mechanical Manufacturing 05-1
Supervisor: Xie Weirong
Date: June 2009

Joint-based Robot Design
By
Zhang Anhuai
June 2009
Graduation Project (Thesis)
English Literature Translation

Student: Zhang Anhuai
Student ID: 0515011127
Department: Department of Mechanical Engineering
Class: Mechanical Manufacturing 05-1
Supervisor: Xie Weirong
Date: June 2009
Abstract (Chinese)

The articulated manipulator designed in this thesis uses a cylindrical coordinate layout and can perform loading, turnover, and similar functions. The manipulator consists mainly of a gripper, wrist, arm, and body, and has four degrees of freedom: wrist rotation, arm extension, arm lift, and arm rotation, which satisfy general industrial requirements.
The manipulator is positioned by potentiometers and operates under point-to-point control; the control system uses a PLC, giving good versatility and flexibility.
The manipulator is hydraulically driven: the four degrees of freedom and the gripper clamping are all actuated by hydraulic cylinders. The oil circuit was laid out and planned with manufacturing feasibility constantly in mind and arranged as a spatial structure, making the structure of the manipulator simpler and more compact.

Keywords: articulated manipulator; cylindrical coordinates; hydraulic cylinder; programmable control
Abstract

This thesis presents the design of an articulated manipulator of the cylindrical-coordinate type, capable of loading, turnover, and other functions. The manipulator consists mainly of a gripper, wrist, arm, and body, with four degrees of freedom (wrist rotation, arm extension, arm lift, and arm rotation), and meets general industrial requirements.
The manipulator is positioned by potentiometers under point-to-point control; the control system uses a programmable logic controller (PLC), giving good versatility and flexibility.
The manipulator is hydraulically driven: the four degrees of freedom and the gripper clamping are all actuated by hydraulic cylinders. The oil circuit was laid out and planned with manufacturing feasibility in mind and arranged as a spatial structure, making the structure of the manipulator simpler and more compact.

Keywords: articulated manipulator; cylindrical coordinates; hydraulic cylinder; PLC
Contents

Abstract (Chinese) ………………………………………………………………… i
Abstract ……………………………………………………………………………… ii
1 Introduction ……………………………………………………………………… 1
1.1 Purpose and significance ……………………………………………………… 1
1.2 Research content ……………………………………………………………… 2
2 Overall design of the manipulator ……………………………………………… 3
2.1 Composition of an industrial manipulator …………………………………… 3
2.1.1 Actuating mechanism …………………………………………………… 3
2.1.2 Drive mechanism ………………………………………………………… 4
2.1.3 Control system …………………………………………………………… 4
2.2 Main technical parameters of the articulated manipulator ………………… 4
2.3 Kinematic diagram of the cylindrical-coordinate manipulator ……………… 5
3 Mechanical system design ……………………………………………………… 6
3.1 Hand ……………………………………………………………………………… 6
3.1.1 Calculation of the gripping force ……………………………………… 6
3.1.2 Calculation of the clamping-cylinder driving force ………………… 7
3.1.3 Gripping-error analysis and calculation for two-pivot swing fingers … 8
3.1.4 Clamping-cylinder calculation ………………………………………… 10
3.2 Wrist …………………………………………………………………………… 11
3.2.1 Basic requirements for wrist design ………………………………… 11
3.2.2 Calculation of the wrist rotary torque ……………………………… 12
3.2.3 Design calculation of the wrist rotary cylinder …………………… 14
3.3 Arm ……………………………………………………………………………… 15
3.3.1 Arm extension cylinder ………………………………………………… 15
3.3.2 Arm rotary cylinder …………………………………………………… 23
4 Hydraulic drive system ………………………………………………………… 27
4.1 Hydraulic system of the program-controlled manipulator ………………… 27
4.2 Hydraulic system ……………………………………………………………… 27
4.2.1 Pressure-switching circuits of the cylinders ……………………… 27
4.2.2 Speed-control scheme …………………………………………………… 28
4.2.3 Deceleration and cushioning circuit ………………………………… 29
4.3 Integration of the hydraulic system ………………………………………… 29
5 Programmable control of the manipulator ……………………………………… 31
5.1 Assignment of input/output contacts ………………………………………… 31
5.1.1 Assignment of limit switches ………………………………………… 31
5.1.2 Assignment of manual push-buttons …………………………………… 31
5.1.3 Assignment of input/output relays …………………………………… 32
5.2 External wiring diagram ……………………………………………………… 32
5.3 Control-panel design …………………………………………………………… 33
5.4 State-control diagram ………………………………………………………… 34
5.5 Ladder diagram ………………………………………………………………… 35
Conclusion …………………………………………………………………………… 37
Acknowledgements …………………………………………………………………… 38
References …………………………………………………………………………… 39
Extending Blender: Development of a Haptic Authoring Tool
Abstract - In this paper, we present our work to extend a well-known 3D graphic modeler, Blender, to support haptic modeling and rendering. The extension tool is named HAMLAT (Haptic Application Markup Language Authoring Tool). We describe the modifications and additions to the Blender source code which have been used to create HAMLAT. Furthermore, we present and discuss the design decisions made when developing HAMLAT, as well as an implementation "road map" which describes the changes to the Blender source code. Finally, we conclude with a discussion of our future development and research avenues.
Keywords - Haptics, HAML, Graphic Modelers, Blender, Virtual Environments.
I. INTRODUCTION
A. Motivation
The increasing adoption of haptic modality in human-computer interaction paradigms has led to a huge demand for new tools that help novice users to author and edit haptic applications. Currently, the haptic application development process is a time consuming experience that requires programming expertise. The complexity of haptic applications development rises from the fact that the haptic application components (such as the haptic API, the device, the haptic rendering algorithms, etc.) need to interact with the graphic components in order to achieve synchronicity.
Additionally, there is a lack of application portability as the application is tightly coupled to a specific device that necessitates the use of its corresponding API. Therefore, device and API heterogeneity lead to the fragmentation and disorientation of both researchers and developers. In view of all these considerations, there is a clear need for an authoring tool that can build haptic applications while hiding programming details from the application modeler (such as API, device, or virtual model).
This paper describes the technical development of the Haptic Application Markup Language Authoring Tool (HAMLAT). It is intended to explain the design decisions used for developing HAMLAT and also provides an implementation "road map", describing the source code of the project.
B. Blender
HAMLAT is based on the Blender [1] software suite, an open-source 3D modeling package with a rich feature set. It has a sophisticated user interface noted for its efficiency and flexibility, and it supports multiple file formats, a physics engine, modern computer-graphics rendering, and many other features.
Because of Blender's open architecture and supportive community base, it was selected as the platform of choice for development of HAMLAT. The open-source nature of Blender means HAMLAT can easily leverage its existing functionality and focus on integrating haptic features which make it a complete hapto-visual modeling tool, since developing a 3D modeling platform from scratch requires considerable development time and expertise in order to reach the level of functionality of Blender. Also, we can take advantage of future improvements to Blender by merging changes from its source code into the HAMLAT source tree.
HAMLAT builds on existing Blender components, such as the user interface and editing tools, by adding new components which focus on the representation, modification, and rendering of the haptic properties of objects in a 3D scene. By using Blender as the basis for HAMLAT, we hope to develop a 3D haptic modeling tool which has the maturity and features of Blender combined with the novelty of haptic rendering.
At the time of writing, HAMLAT is based on Blender version 2.43 source code.
C. Project Goals
As previously stated, the overall goal of the HAMLAT project is to produce a polished software application which combines the features of a modern graphic modeling tool with haptic rendering techniques. HAMLAT has the "look and feel" of a 3D graphical modeling package, but adds features such as haptic rendering and haptic property descriptors. This allows artists, modelers, and developers to generate realistic 3D hapto-visual virtual environments.
A high-level block diagram of HAMLAT is shown in Figure 1. It illustrates the flow of data in the haptic modeling. HAMLAT assists the modeler, or application developer, in building hapto-visual applications which may be stored in a database for later retrieval by another haptic application. By hapto-visual application we refer to any software which displays a 3D scene both visually and haptically to a user in a virtual setting. An XML file format, called HAML [2], is used to describe the 3D scenes and store the hapto-visual environments built by a modeler for later playback to an end user.
Traditionally, building hapto-visual environments has required a strong technical and programming background. The task of haptically rendering a 3D scene is tedious, since haptic properties must be assigned to individual objects in the scene, and currently there are few high-level tools for accomplishing this task. HAMLAT bridges this gap by integrating into the HAML framework and delivering a complete solution for developing hapto-visual applications that requires no programming knowledge.
The remainder of the paper is organized as follows: in Section 2, we present the proposed architecture extensions and discuss design constraints. Section 3 describes the implementation details and how haptic properties are added and rendered within the Blender framework. In Section 4 we discuss related issues and future work avenues.
II. SYSTEM OVERVIEW AND ARCHITECTURE
The Blender design philosophy is based on three main tasks: data storage, editing, and visualization. According to the legacy documentation [3], it follows a data-visualize-edit development cycle for the 3D modeling pipeline. A 3D scene is represented using data structures within the Blender architecture. The modeler views the scene, makes changes using the editing interface which directly modifies the underlying data structures, and then the cycle repeats.
To better understand this development cycle, consider the representation of a 3D object in Blender. A 3D object may be represented by an array of vertices organized as a polygonal mesh. Users may choose to operate on any subset of this data set. Editing tasks may include operations to rotate, scale, and translate the vertices, or perhaps a re-meshing algorithm to "clean up" redundant vertices and transform the mesh from a quad to a triangle topology. The data is visualized using a graphical 3D renderer capable of displaying the object as a wireframe or as a shaded, solid surface. The visualization is necessary in order to see the effects of editing on the data. In a nutshell, this example illustrates the design philosophy behind Blender's architecture.
In Blender, data is organized as a series of lists, and base data types are combined with links between items in each list, creating complex scenes from simple structures. This allows data elements in each list to be reused, reducing the overall storage requirements. For example, a mesh may be linked by multiple scene objects; the position and orientation may differ for each object while the topology of the mesh remains the same. A diagram illustrating the organization of data structures and the reuse of scene elements is shown in Figure 2: a scene object links to three objects, each of which links to two polygonal meshes. The meshes also share a common material property. The entire scene is rendered on one of several screens, which visualizes the scene.
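The data-linking scheme described above can be sketched in a few lines of Python; the class and attribute names here are purely illustrative and are not Blender's actual types:

```python
# Illustrative sketch (not Blender source) of datablock linking: several
# scene objects share one mesh, so the geometry is stored only once while
# each object keeps its own transform.

class Mesh:
    def __init__(self, vertices):
        self.vertices = vertices      # shared geometry

class SceneObject:
    def __init__(self, name, mesh, position):
        self.name = name
        self.mesh = mesh              # a link to shared data, not a copy
        self.position = position      # per-object transform

shared = Mesh([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
a = SceneObject("a", shared, (0, 0, 0))
b = SceneObject("b", shared, (5, 0, 0))

# A topology edit made through one link is visible through every other link:
shared.vertices.append((1, 1, 0))
```

Because both objects hold a reference to the same mesh, an edit to its topology affects every object that links it, exactly the reuse behaviour the list-and-link organization is designed to provide.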
We adopt the Blender design approach for our authoring tool. The data structures which represent objects in a 3D scene have been augmented to include fields for haptic properties (e.g., stiffness, damping); the user-interface components (e.g., button panels) which allow the modeler to change object properties have also been updated to support modifying the haptic properties of an object. Additionally, an interactive hapto-visual renderer has been implemented to display the 3D scene graphically and haptically, providing the modeler or artist with immediate feedback about the changes they make to the scene. In the current version of HAMLAT, the modifications to the Blender framework include: data structures for representing haptic properties; an editing interface for modifying haptic properties; an external renderer for displaying and previewing haptically enabled scenes; and scripts which allow scenes to be imported/exported in the HAML file format.
A class diagram outlining the changes to the Blender framework is shown in Figure 3. Components which are pertinent to HAMLAT are shaded in gray. HAMLAT builds on existing Blender sub-systems by extending them for haptic modeling purposes. The data structures for representing object geometry and graphical rendering are augmented to include fields which encompass the tactile properties necessary for haptic rendering.
To allow the user to modify haptic properties, GUI components are integrated into the Blender editing panels. The operations triggered by these components operate directly on the data structures used for representing haptic cues and may be considered part of the editing step of the Blender design cycle.
Similarly to the built-in graphical renderer, HAMLAT uses a custom renderer for displaying 3D scenes graphically and haptically, and it is independent of the Blender renderer. This component is developed independently since haptic and graphical rendering must be performed simultaneously and synchronously. A simulation loop is used to update the haptic rendering forces at a rate which maintains stability and quality. A detailed discussion of the implementation of these classes and their connectivity is given in the next section.
III. IMPLEMENTATION
A. Data Structures
A.1 Mesh Data Type
Blender uses many different data structures to represent the various types of objects in a 3D scene: a mesh object contains an array of vertices; a lamp contains colour and intensity values; and a camera object contains intrinsic viewing parameters.
The Mesh data structure is used by the Blender framework to describe a polygonal mesh object. It is of particular interest for haptic rendering since many solid objects in a 3D scene may be represented using this type of data structure. The tactile and kinesthetic cues, which are displayed due to interaction with virtual objects, are typically rendered based on the geometry of the mesh, so haptic rendering is performed primarily on data stored in this data type. Other scene components, such as lamps, cameras, or lines, are not intuitively rendered using force-feedback haptic devices and are therefore not of current interest for haptic rendering.
An augmented version of the Mesh data structure is shown in Figure 4. It contains fields for vertex and face data, plus some special custom data fields which allow data to be stored to and retrieved from disk and memory. We have modified this data type to include a pointer to an MHaptics data structure, which stores haptic properties such as stiffness, damping, and friction for the mesh elements (Figure 5).
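As a rough illustration of the record attached to the mesh type, the haptic fields named in the text (stiffness, damping, and the two friction coefficients) can be sketched as follows. This is a hedged Python approximation for exposition only, not the actual C structure from the HAMLAT source:

```python
from dataclasses import dataclass

@dataclass
class MHaptics:
    """Per-mesh haptic surface properties; all coefficients range over [0, 1].
    Field names follow the variable names mentioned in the text; defaults
    are invented for illustration."""
    stiffness: float = 0.5    # ks: resistance to deformation
    damping: float = 0.1      # kd: resistance to the rate of deformation
    st_friction: float = 0.2  # static friction coefficient
    dy_friction: float = 0.1  # dynamic friction coefficient

    def clamped(self) -> "MHaptics":
        """Return a copy with every coefficient clipped into [0, 1]."""
        clip = lambda v: min(max(v, 0.0), 1.0)
        return MHaptics(clip(self.stiffness), clip(self.damping),
                        clip(self.st_friction), clip(self.dy_friction))
```

Keeping the haptic fields in one small record, referenced by a single pointer from Mesh, matches the Blender convention of linking auxiliary datablocks rather than widening the core structure.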
A.2 Edit Mesh Data Type
It should be noted that the Mesh data type has a complementary data structure, called EditMesh, which is used when editing mesh data. It holds a copy of the vertex, edge, and face data for a polygonal mesh. When the user switches to editing mode, Blender copies the data from a Mesh into an EditMesh, and when editing is complete the data is copied back.
Care must be taken to ensure that the haptic property data structure remains intact during the copy sequence. The EditMesh data structure has not been modified to contain a copy of the haptic property data, but this may change in the future (for example, if modifying haptic properties in edit mode is required). The editing mode is mainly used to modify mesh topology and geometry, not the haptic and graphical rendering characteristics.
A.3 Haptic Properties
In this section we briefly discuss the haptic properties which may currently be modeled using HAMLAT. It is important for the modeler to understand these properties and their basis for use in haptic rendering.
The stiffness of an object defines how resistant it is to deformation by some applied force. Hard objects, such as a rock or a table, have very high stiffness; soft objects, such as a rubber ball, have low stiffness. The hardness or softness of an object is typically rendered using the spring-force equation

    f = ks * x

where the force-feedback vector f displayed to the user is computed from ks, the stiffness coefficient of the object (variable name stiffness), and x, the penetration depth (displacement) of the haptic proxy into the object. The stiffness coefficient has a range of [0,1], where 0 represents no resistance to deformation and 1 represents the maximum stiffness which may be rendered by the haptic device. The damping of an object defines its resistance to the rate of deformation due to some applied force. It is typically rendered using the force equation

    f = kd * dx/dt

where kd is the damping coefficient (variable name damping in MHaptics) and dx/dt is the velocity of the haptic proxy as it penetrates the object. The damping coefficient also has a range of [0,1] and may be used to model the viscous behaviour of a material. It also increases the stability of the haptic rendering loop for stiff materials.
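The two force laws above can be combined in a small numerical sketch (a scalar, one-dimensional simplification; the function names are ours, not HAMLAT's):

```python
def spring_force(ks: float, x: float) -> float:
    """Stiffness term: f = ks * x for penetration depth x."""
    return ks * x

def damping_force(kd: float, dx_dt: float) -> float:
    """Damping term: f = kd * dx/dt for penetration velocity dx/dt."""
    return kd * dx_dt

def feedback_force(ks: float, kd: float, x: float, dx_dt: float) -> float:
    """Total normal feedback force opposing penetration of the haptic proxy."""
    return spring_force(ks, x) + damping_force(kd, dx_dt)
```

In a real renderer this computation runs inside the high-rate simulation loop mentioned earlier, with x and dx/dt supplied by the collision detection between the proxy and the mesh.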
The static friction coefficient (variable name st_friction) and dynamic friction coefficient (variable name dy_friction) are used to model the frictional forces experienced while exploring the surface of a 3D object. Static friction is experienced when the proxy is not moving over the object's surface, and an initial force must be applied to overcome it. Dynamic friction is felt when the proxy moves across the surface, rubbing against it.
Frictional coefficients also have a range of [0,1], with a value of 0 making the surface of a 3D object feel "slippery" and a value of 1 making the object feel very rough. Frictional forces are typically rendered in a direction tangential to the collision point of the haptic proxy at an object's surface.

B. Editing

Blender uses a set of non-overlapping windows called spaces to modify various aspects of the 3D scene and its objects. Each space is divided into a set of areas and panels which are context aware; that is, they provide functionality based on the selected object type. For example, if a camera is selected, the panel displays components which allow the modeler to change the focal length and viewing angle of the camera, but these components do not appear if an object of another type is selected.
Figure 6 shows a screen shot of the button space which is used to edit properties for a haptic mesh. It includes user-interface panels which allow a modeler to change the graphical shading properties of the mesh, perform simple re-meshing operations, and to modify the haptic properties of the selected mesh.
HAMLAT follows the context-sensitive behavior of Blender by only displaying the haptic editing panel when a polygonal mesh object is selected. In the future, this panel may be duplicated to support haptic modeling for other object types, such as NURBS surfaces. The Blender framework offers many user-interface components (e.g., buttons, sliders, pop-up menus) which may be used to edit the underlying data structures. The haptic properties of mesh objects are editable using sliders or by entering a floating-point value into a text box located adjacent to each slider. When the value of the slider or text box is changed, an event is triggered in the Blender window sub-system. A unique identifier indicates that the event is for the haptic property panel and that the HAMLAT code should be called to update the haptic properties of the currently selected mesh.
C. Hapto-Visual Rendering

Blender currently supports graphical rendering of scenes using an internal renderer or an external renderer (e.g., [4]). In this spirit, the haptic renderer used by HAMLAT has been developed as an external renderer. It uses OpenGL and the OpenHaptics toolkit [5] to perform graphic and haptic rendering, respectively.
The 3D scene being modeled is rendered in two passes: the first pass renders the scene graphically, and the second pass renders it haptically. The second pass is required because the OpenHaptics toolkit intercepts commands sent to the OpenGL pipeline and uses them to display the scene using haptic rendering techniques. In this pass, the haptic properties of each mesh object are used much in the same way that color and lighting are used by graphical rendering: they define the type of material for each object. To save CPU cycles, the lighting and graphical material properties are excluded from the haptic rendering pass.
Figure 7 shows the source code used to apply the material properties during the haptic rendering pass. The haptic renderer is independent of the Blender framework in that it exists outside the original source code. However, it is still heavily dependent on Blender data structures and types.
D. Scripting
The Blender Python (BPy) wrapper exposes many of the internal data structures so that the internal Python scripting engine may access them. Similar to the data structures used for representing mesh objects in the native Blender framework, these wrappers allow user-defined scripts to access and modify the elements of a 3D scene.
The haptic properties of a mesh object may be accessed through the Mesh wrapper class. A haptics attribute has been added to each of these classes and is accessed through the Python scripting system. Figure 8 shows Python code which reads the haptic properties from a mesh object and exports them to a file. Similar code is used to import and export HAML scenes from and to files.
An import script allows 3D scenes to be read from a HAML file and reproduced in the HAMLAT application; an export script allows 3D scenes to be written to a HAML file, including haptic properties, and used in other HAML applications.
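The export path can be pictured with a short sketch that serializes one mesh's haptic properties to an XML fragment. The element and attribute names below are invented for illustration and do not follow the actual HAML schema [2]:

```python
# Hypothetical sketch of a HAML-style export step: write a mesh's haptic
# properties as nested XML elements. Element names are illustrative only.
import xml.etree.ElementTree as ET

def export_haptics(mesh_name: str, props: dict) -> str:
    """Serialize one mesh's haptic properties as a HAML-like XML fragment."""
    mesh = ET.Element("mesh", name=mesh_name)
    haptics = ET.SubElement(mesh, "haptics")
    for key in ("stiffness", "damping", "st_friction", "dy_friction"):
        ET.SubElement(haptics, key).text = str(props[key])
    return ET.tostring(mesh, encoding="unicode")

fragment = export_haptics(
    "cube", {"stiffness": 0.8, "damping": 0.1,
             "st_friction": 0.3, "dy_friction": 0.2})
```

An import script would do the inverse: parse such a fragment and write the values back into the mesh's haptics attribute.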
The BPy wrappers also expose the Blender windowing system. Figure 9 shows a panel which appears when the user exports a 3D scene to the HAML file format. It allows the user to specify supplementary information about the application, such as a description, target hardware, and system requirements. These fields are defined by the HAML specification [2] and are included with the authored scene as part of the HAML file format. The user-interface components displayed on this panel are easily extended to track future revisions of HAML.
IV. CONCLUSIONS AND FUTURE WORK

The current version of HAMLAT shows that a unified modeling tool for graphics and haptics is possible. Promisingly, the features for modeling haptic properties have been integrated seamlessly into the Blender framework, which indicates that Blender was a good choice as a platform for developing this tool. Blender's modular architecture will make future additions to its framework very straightforward.
Currently, HAMLAT supports basic functionality for modeling and rendering hapto-visual applications. Scenes may be created, edited, previewed, and exported as part of a database for use by other hapto-visual applications, such as the HAML player [6]. However, there is room for growth, and there are many more ways we can continue leveraging existing Blender functionality.
As future work, we plan to extend HAMLAT to include support for other haptic platforms and devices. Currently, only the PHANTOM series of devices is supported, since the interactive renderer depends on the OpenHaptics toolkit [5]. In order to support other devices, a cross-platform library such as CHAI3D or Haptik may be used to perform rendering. These libraries support force rendering for a large range of haptic hardware. Fortunately, due to the modularity of our implementation, only the interactive haptic rendering component needs to be altered for these changes.
In addition to supporting multiple hardware platforms, a user interface co