Eric S. Johansson writes:
Control means there is a task-specific interface in place. For
example, something like traceroute would need some way of
incrementally building the command, memorizing it under a specific
name, and then taking an argument which translates a name to a
hostname.
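For what it's worth, a task-specific command like that is easy to
sketch in Lisp. Something like the following (untested; the command
name and the host table are invented for the example):

(defvar my-host-alist
  '(("mailhost" . "mail.example.com")
    ("gateway"  . "gw.example.com"))
  "Speakable names mapped to hostnames.")

(defun my-traceroute (name)
  "Run traceroute on the host that NAME maps to in `my-host-alist'."
  (interactive
   (list (completing-read "Host name: " my-host-alist nil t)))
  (shell-command (concat "traceroute "
                         (cdr (assoc name my-host-alist)))))

Speech recognition would then only need to speak the command name and
one of the names in the table.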
I assume you can access menus?
One thing that's been on the agenda for a while is a menu editor. A
Lisp menu is just a sequence of sequences, something like
(defconst default-menubar
  ;; This is backquoted; a lambda with a preceding , will be byte-compiled.
  `(("%_File"
     ["%_Open..." find-file]
     ("Open with Specified %_Encoding"
      :filter
      ,#'(lambda (menu)
           (coding-system-menu-filter
            (lambda (entry)
              (let ((coding-system-for-read entry))
                (call-interactively 'find-file)))
            (lambda (entry) t)
            t)))
     "-----"
     ["%_Save" save-buffer
      :active (buffer-modified-p)
      :suffix (if put-buffer-names-in-file-menu (buffer-name) "")])
    ("%_Edit"
     ["%_Undo" undo
      :active (and (not (eq buffer-undo-list t))
                   (or buffer-undo-list pending-undo-list))
      :suffix (if (eq last-command 'undo) "More" "")]
     "----"
     ["Cu%_t" kill-primary-selection
      :active (selection-owner-p)]
     ["%_Paste" yank-clipboard-selection
      :active (selection-exists-p 'CLIPBOARD)]
     ["%_Delete" delete-primary-selection
      :active (selection-owner-p)])))
This is very complicated; it's a selection from the main menubar
code, and it doesn't get more complicated than that. :-) But Lisp is
very good at editing structures like this, and we should be able to
create a menu editor that navigates the menus just like selecting a
command. It would be substantially easier if it didn't need to handle
features like :active and :suffix.
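To give the flavor, walking one level of a menu spec like the excerpt
above is just list and vector access. A rough sketch (the function
name is mine, not part of XEmacs):

(defun my-menu-item-labels (menu)
  "Return the labels of the entries in MENU, one level deep."
  (delq nil
        (mapcar (lambda (item)
                  (cond ((vectorp item) (aref item 0)) ; ["Label" CALLBACK ...]
                        ((consp item) (car item))      ; ("Submenu" ...)
                        (t nil)))                      ; "-----" separators
                (cdr menu))))

;; (my-menu-item-labels (assoc "%_File" default-menubar))
;;   => ("%_Open..." "Open with Specified %_Encoding" "%_Save")

A menu editor would do the same kind of traversal, presenting the
labels and rewriting the items in place.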
XEmacs also has a "keyboard macro" facility, but it could use some
improvement for your purposes.
Emacs because I do need a particular edit control if I'm going to
edit using speech recognition. The concept I'm trying to exploit is
putting a transformation filter between the environment holding
non-speakable data and a speakable editing environment. Outbound, it
translates to an English-like format and the return is the inverse
(an idempotent transform).
[snip]
I need to go back over these concepts and think about what it means
when the editing environment is on a remote machine and the speech
recognition environment is local. I'd like to be able to run a remote
Emacs so that things like Shell and other tools could be used with
whatever accessibility features are developed.
I'm not sure how that would work. XEmacs does have TTY and
stream-oriented consoles (it's painful, but you can actually work
with XEmacs using only stdin and stdout ... it's not unreasonable for
playing "doctor" or "dunnet"). Maybe the stream-oriented console
could attach to the voice recognition.
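On the transformation filter idea, the outbound half might look
something like this in Lisp (a toy sketch; the table and the names
are invented, and a real inverse would have to be much more careful
about collisions):

(require 'cl)                           ; for `dolist'

(defvar my-speakable-alist
  '(("->" . " arrow ")
    ("_"  . " underscore "))
  "Pairs of (NON-SPEAKABLE . SPEAKABLE) strings.")

(defun my-speakable-encode (string)
  "Return STRING with non-speakable tokens replaced by words."
  (dolist (pair my-speakable-alist string)
    (setq string (replace-in-string string
                                    (regexp-quote (car pair))
                                    (cdr pair)))))

(defun my-speakable-decode (string)
  "Invert `my-speakable-encode', assuming STRING round-trips cleanly."
  (dolist (pair my-speakable-alist string)
    (setq string (replace-in-string string
                                    (regexp-quote (cdr pair))
                                    (car pair)))))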